Oct 03 08:39:14 crc systemd[1]: Starting Kubernetes Kubelet... Oct 03 08:39:14 crc restorecon[4687]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 03 08:39:14 
crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 
08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc 
restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 
crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 
crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:39:14 crc restorecon[4687]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:14 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 
08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:39:15 crc 
restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 
08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 
08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:15 crc restorecon[4687]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:39:15 crc restorecon[4687]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 03 08:39:16 crc kubenswrapper[4765]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 08:39:16 crc kubenswrapper[4765]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 03 08:39:16 crc kubenswrapper[4765]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 08:39:16 crc kubenswrapper[4765]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 03 08:39:16 crc kubenswrapper[4765]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 03 08:39:16 crc kubenswrapper[4765]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.075047 4765 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079553 4765 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079571 4765 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079575 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079580 4765 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079584 4765 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079589 4765 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079592 4765 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079596 4765 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079600 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079603 4765 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079610 4765 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079614 4765 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079617 4765 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079621 4765 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079624 4765 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079628 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079632 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079635 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079638 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079655 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079659 4765 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079664 4765 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079668 4765 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079671 4765 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079675 4765 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079679 4765 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079683 4765 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079686 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079690 4765 feature_gate.go:330] unrecognized feature gate: Example Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079693 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079697 4765 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079700 4765 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079704 4765 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079707 4765 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079710 4765 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079714 4765 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079718 4765 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079721 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079724 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079734 4765 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079739 4765 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079743 4765 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079749 4765 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079753 4765 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079757 4765 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079760 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079764 4765 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079767 4765 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079771 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079774 4765 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079777 4765 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079781 4765 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079786 4765 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079792 4765 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079802 4765 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079809 4765 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079814 4765 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079819 4765 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079824 4765 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079830 4765 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079835 4765 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079842 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079847 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079854 4765 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079858 4765 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079863 4765 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079867 4765 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079871 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079876 4765 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079880 4765 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.079884 4765 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080688 4765 flags.go:64] FLAG: --address="0.0.0.0" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080702 4765 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080709 4765 flags.go:64] FLAG: --anonymous-auth="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080715 4765 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080721 4765 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080725 4765 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080731 4765 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080737 4765 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080741 4765 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080746 4765 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080751 4765 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080756 4765 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080760 4765 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080764 4765 flags.go:64] FLAG: --cgroup-root="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080768 4765 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080772 4765 
flags.go:64] FLAG: --client-ca-file="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080776 4765 flags.go:64] FLAG: --cloud-config="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080780 4765 flags.go:64] FLAG: --cloud-provider="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080784 4765 flags.go:64] FLAG: --cluster-dns="[]" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080790 4765 flags.go:64] FLAG: --cluster-domain="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080794 4765 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080798 4765 flags.go:64] FLAG: --config-dir="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080802 4765 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080806 4765 flags.go:64] FLAG: --container-log-max-files="5" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080812 4765 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080816 4765 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080821 4765 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080825 4765 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080829 4765 flags.go:64] FLAG: --contention-profiling="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080833 4765 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080837 4765 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080841 4765 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080849 4765 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080854 4765 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080858 4765 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080862 4765 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080866 4765 flags.go:64] FLAG: --enable-load-reader="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080878 4765 flags.go:64] FLAG: --enable-server="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080882 4765 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080887 4765 flags.go:64] FLAG: --event-burst="100" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080892 4765 flags.go:64] FLAG: --event-qps="50" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080896 4765 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080901 4765 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080904 4765 flags.go:64] FLAG: --eviction-hard="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080910 4765 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080915 4765 flags.go:64] FLAG: 
--eviction-minimum-reclaim="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080920 4765 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080931 4765 flags.go:64] FLAG: --eviction-soft="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080938 4765 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080943 4765 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080948 4765 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080953 4765 flags.go:64] FLAG: --experimental-mounter-path="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080958 4765 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080963 4765 flags.go:64] FLAG: --fail-swap-on="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080968 4765 flags.go:64] FLAG: --feature-gates="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080980 4765 flags.go:64] FLAG: --file-check-frequency="20s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080985 4765 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080990 4765 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080995 4765 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.080999 4765 flags.go:64] FLAG: --healthz-port="10248" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081004 4765 flags.go:64] FLAG: --help="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081008 4765 flags.go:64] FLAG: --hostname-override="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081013 4765 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081018 4765 flags.go:64] FLAG: --http-check-frequency="20s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081024 4765 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081028 4765 flags.go:64] FLAG: --image-credential-provider-config="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081032 4765 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081036 4765 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081040 4765 flags.go:64] FLAG: --image-service-endpoint="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081044 4765 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081048 4765 flags.go:64] FLAG: --kube-api-burst="100" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081052 4765 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081057 4765 flags.go:64] FLAG: --kube-api-qps="50" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081061 4765 flags.go:64] FLAG: --kube-reserved="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081066 4765 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081069 4765 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081074 4765 flags.go:64] FLAG: --kubelet-cgroups="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081079 4765 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081084 4765 flags.go:64] FLAG: --lock-file="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081089 4765 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081094 4765 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081099 4765 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081108 4765 flags.go:64] FLAG: --log-json-split-stream="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081112 4765 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081117 4765 flags.go:64] FLAG: --log-text-split-stream="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081122 4765 flags.go:64] FLAG: --logging-format="text" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081126 4765 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081130 4765 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081134 4765 flags.go:64] FLAG: --manifest-url="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081138 4765 flags.go:64] FLAG: --manifest-url-header="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081144 4765 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081148 4765 flags.go:64] FLAG: --max-open-files="1000000" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081153 4765 flags.go:64] FLAG: --max-pods="110" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081157 4765 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081161 4765 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081165 4765 flags.go:64] FLAG: --memory-manager-policy="None" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081171 4765 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081175 4765 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081180 4765 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081184 4765 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081194 4765 flags.go:64] FLAG: --node-status-max-images="50" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081198 4765 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081202 4765 flags.go:64] FLAG: --oom-score-adj="-999" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081207 4765 flags.go:64] FLAG: --pod-cidr="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081211 4765 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081219 4765 flags.go:64] FLAG: --pod-manifest-path="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081222 4765 flags.go:64] FLAG: --pod-max-pids="-1" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081227 4765 flags.go:64] FLAG: --pods-per-core="0" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081231 4765 flags.go:64] FLAG: --port="10250" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081235 4765 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081239 4765 flags.go:64] FLAG: --provider-id="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081243 4765 flags.go:64] FLAG: --qos-reserved="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081247 4765 flags.go:64] FLAG: --read-only-port="10255" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081251 4765 flags.go:64] FLAG: --register-node="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081255 4765 flags.go:64] FLAG: --register-schedulable="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081259 4765 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081266 4765 flags.go:64] FLAG: --registry-burst="10" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081270 4765 flags.go:64] FLAG: --registry-qps="5" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081274 4765 flags.go:64] FLAG: --reserved-cpus="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081278 4765 flags.go:64] FLAG: --reserved-memory="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081283 4765 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081287 4765 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081292 4765 flags.go:64] FLAG: --rotate-certificates="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081296 4765 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081300 4765 flags.go:64] FLAG: --runonce="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081305 4765 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081309 4765 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081313 4765 flags.go:64] FLAG: --seccomp-default="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081317 4765 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081325 4765 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081330 4765 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081334 4765 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081338 4765 flags.go:64] FLAG: --storage-driver-password="root" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081342 4765 flags.go:64] FLAG: --storage-driver-secure="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 
08:39:16.081347 4765 flags.go:64] FLAG: --storage-driver-table="stats" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081351 4765 flags.go:64] FLAG: --storage-driver-user="root" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081356 4765 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081360 4765 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081364 4765 flags.go:64] FLAG: --system-cgroups="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081368 4765 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081374 4765 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081378 4765 flags.go:64] FLAG: --tls-cert-file="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081382 4765 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081388 4765 flags.go:64] FLAG: --tls-min-version="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081392 4765 flags.go:64] FLAG: --tls-private-key-file="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081396 4765 flags.go:64] FLAG: --topology-manager-policy="none" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081400 4765 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081404 4765 flags.go:64] FLAG: --topology-manager-scope="container" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081408 4765 flags.go:64] FLAG: --v="2" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081414 4765 flags.go:64] FLAG: --version="false" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081419 4765 flags.go:64] FLAG: --vmodule="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081424 4765 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081429 4765 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081522 4765 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081528 4765 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081532 4765 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081536 4765 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081541 4765 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081545 4765 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081550 4765 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081554 4765 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081558 4765 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081562 4765 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081566 4765 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081569 4765 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081573 4765 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081576 4765 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081580 4765 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081584 4765 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081588 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081595 4765 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081604 4765 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081610 4765 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081614 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081619 4765 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081623 4765 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081627 4765 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081632 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081636 4765 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081659 4765 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081666 4765 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081671 4765 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081676 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081681 4765 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081686 4765 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081692 4765 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081698 4765 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081703 4765 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081708 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081713 4765 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081717 4765 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081722 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081727 4765 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081732 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081737 4765 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081742 4765 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081746 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081751 4765 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081755 4765 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081761 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081765 4765 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081770 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081775 4765 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081779 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081784 4765 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig 
Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081788 4765 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081794 4765 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081799 4765 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081803 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081808 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081812 4765 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081818 4765 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081823 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081827 4765 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081832 4765 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081836 4765 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081840 4765 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081845 4765 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081849 4765 feature_gate.go:330] unrecognized feature gate: Example Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081853 4765 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081858 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081862 4765 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081867 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.081871 4765 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.081880 4765 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.089957 4765 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.089983 4765 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090034 4765 feature_gate.go:330] unrecognized feature gate: 
VSphereDriverConfiguration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090042 4765 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090046 4765 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090050 4765 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090054 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090058 4765 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090062 4765 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090066 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090070 4765 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090074 4765 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090077 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090081 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090084 4765 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090087 4765 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090091 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090094 4765 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090099 4765 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090105 4765 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090109 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090112 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090116 4765 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090120 4765 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090123 4765 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090127 4765 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090131 4765 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090136 4765 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090140 4765 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090143 4765 feature_gate.go:330] unrecognized feature gate: Example Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090146 4765 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090150 4765 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090153 4765 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090157 4765 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090161 4765 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090164 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090169 4765 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090172 4765 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090176 4765 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090179 4765 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090183 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090186 4765 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090189 4765 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090193 4765 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090196 4765 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090200 4765 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090203 4765 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090207 4765 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090212 4765 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090217 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090220 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090224 4765 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090227 4765 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090231 4765 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090234 4765 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090238 4765 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090241 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090244 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090248 4765 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090253 4765 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090257 4765 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090261 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090265 4765 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090269 4765 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090272 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090276 4765 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090279 4765 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090283 4765 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090286 4765 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090290 4765 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090293 4765 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090297 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090301 4765 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.090307 4765 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false 
KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090414 4765 feature_gate.go:330] unrecognized feature gate: Example Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090420 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090425 4765 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090428 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090432 4765 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090436 4765 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090439 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090443 4765 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090447 4765 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090451 4765 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090454 4765 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090458 4765 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090462 4765 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090467 4765 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090471 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090474 4765 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090478 4765 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090481 4765 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090485 4765 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090488 4765 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090492 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090495 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090499 4765 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090502 4765 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090506 4765 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090509 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090513 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090516 4765 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090520 4765 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090523 4765 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090527 4765 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090530 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090534 4765 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090537 4765 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090541 4765 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090545 4765 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090549 4765 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090552 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090555 4765 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 
08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090559 4765 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090563 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090568 4765 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090573 4765 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090576 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090581 4765 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090585 4765 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090589 4765 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090593 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090596 4765 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090600 4765 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090603 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090607 4765 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090610 4765 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090614 4765 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090617 4765 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090621 4765 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090625 4765 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090628 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090632 4765 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090635 4765 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090639 4765 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090661 4765 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090668 4765 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090672 4765 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090676 4765 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090681 4765 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090685 4765 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090689 4765 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090693 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090698 4765 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.090703 4765 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.090710 4765 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.090843 4765 server.go:940] "Client rotation is on, will bootstrap in background" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.096220 4765 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.096325 4765 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.097824 4765 server.go:997] "Starting client certificate rotation" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.097844 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.098000 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-25 22:36:58.229606269 +0000 UTC Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.098088 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1285h57m42.131522468s for next certificate rotation Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.130703 4765 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.132572 4765 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.157678 4765 log.go:25] "Validated CRI v1 runtime API" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.199311 4765 log.go:25] "Validated CRI v1 image API" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.201460 4765 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.207118 4765 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-03-08-35-12-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.207173 4765 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.222778 4765 manager.go:217] Machine: {Timestamp:2025-10-03 08:39:16.220205613 +0000 UTC m=+0.521699963 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:c85bcae8-d463-4f60-8737-09c0f3c02573 BootID:4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 
Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:53:d8:69 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:53:d8:69 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:01:9f:0d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:5e:c4:a5 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6c:2c:e4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7e:10:c9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:26:82:3d:42:65:46 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3e:f8:90:b8:89:46 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: 
DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.223039 4765 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.223187 4765 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.223496 4765 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.223672 4765 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.223748 4765 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.223928 4765 topology_manager.go:138] "Creating topology manager with none policy" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.223939 4765 container_manager_linux.go:303] "Creating device plugin manager" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.224423 4765 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.224452 4765 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 
08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.225137 4765 state_mem.go:36] "Initialized new in-memory state store" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.225786 4765 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.231153 4765 kubelet.go:418] "Attempting to sync node with API server" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.231179 4765 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.231207 4765 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.231220 4765 kubelet.go:324] "Adding apiserver pod source" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.231231 4765 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.239700 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.239800 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.240104 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.240286 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.242281 4765 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.243246 4765 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
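[Editor's note] The reflector failures above ("dial tcp 38.102.83.173:6443: connect: connection refused" against https://api-int.crc.testing:6443) are expected at this point in a single-node CRC boot: the static pod path /etc/kubernetes/manifests was only just added, so the kube-apiserver this node hosts is presumably not running yet, and the informers simply retry. A small stdlib probe, as a sketch for checking when the endpoint comes up (hostname and port taken from the log; run from the node so api-int resolves):

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    // Probe the internal API endpoint the kubelet reflectors are retrying.
    // Failure here during early startup matches the log; success means the
    // apiserver static pod is up and the reflector errors should stop.
    func main() {
        conn, err := net.DialTimeout("tcp", "api-int.crc.testing:6443", 3*time.Second)
        if err != nil {
            fmt.Println("apiserver not reachable yet:", err)
            return
        }
        conn.Close()
        fmt.Println("apiserver endpoint is accepting connections")
    }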
Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.244635 4765 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246208 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246240 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246251 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246262 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246277 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246288 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246299 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246330 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246343 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246353 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246372 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.246383 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.247292 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.247828 4765 server.go:1280] "Started kubelet" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.248052 4765 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.248118 4765 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.248197 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.248722 4765 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 03 08:39:16 crc systemd[1]: Started Kubernetes Kubelet. 
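[Editor's note] At this point systemd reports "Started Kubernetes Kubelet": the volume plugins are registered, the HTTPS endpoint is listening on 0.0.0.0:10250, and the podresources API is served on unix:/var/lib/kubelet/pod-resources/kubelet.sock. A quick on-node sanity check, sketched with the stdlib under the assumption that the port and socket path are exactly as logged above:

    package main

    import (
        "fmt"
        "net"
        "os"
        "time"
    )

    // Verify the two endpoints the kubelet just opened: the main HTTPS port
    // and the podresources gRPC socket. Run on the node itself.
    func main() {
        if c, err := net.DialTimeout("tcp", "127.0.0.1:10250", 2*time.Second); err == nil {
            c.Close()
            fmt.Println("kubelet is listening on :10250")
        } else {
            fmt.Println("port 10250 not reachable:", err)
        }
        if fi, err := os.Stat("/var/lib/kubelet/pod-resources/kubelet.sock"); err == nil {
            fmt.Println("podresources socket present, mode:", fi.Mode())
        } else {
            fmt.Println("podresources socket missing:", err)
        }
    }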
Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.249814 4765 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.249842 4765 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.250160 4765 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 09:16:04.347744341 +0000 UTC Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.250213 4765 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2136h36m48.097534168s for next certificate rotation Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.250500 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.251200 4765 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.251222 4765 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.251238 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="200ms" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.251281 4765 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.251694 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.251771 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.252941 4765 factory.go:55] Registering systemd factory Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.252963 4765 factory.go:221] Registration of the systemd container factory successfully Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.253107 4765 server.go:460] "Adding debug handlers to kubelet server" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.253456 4765 factory.go:153] Registering CRI-O factory Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.253525 4765 factory.go:221] Registration of the crio container factory successfully Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.253841 4765 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.253912 4765 factory.go:103] Registering Raw factory Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.253959 4765 manager.go:1196] Started watching for new ooms in manager Oct 03 08:39:16 crc 
kubenswrapper[4765]: E1003 08:39:16.254814 4765 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.173:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aee6cf9490a5d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 08:39:16.247792221 +0000 UTC m=+0.549286561,LastTimestamp:2025-10-03 08:39:16.247792221 +0000 UTC m=+0.549286561,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.256817 4765 manager.go:319] Starting recovery of all containers Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267148 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267211 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267225 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267239 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267250 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267262 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267275 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267288 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 
08:39:16.267301 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267312 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267323 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267334 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267345 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267365 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267380 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267394 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267409 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267422 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267435 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267446 4765 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267459 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267471 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267484 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267501 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267517 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267533 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267550 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267563 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267578 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267591 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267605 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267616 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267629 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267790 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267810 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267823 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267838 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267851 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267868 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267890 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267903 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267917 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267933 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.267947 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269672 4765 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269708 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269723 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269738 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269753 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269769 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269785 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269799 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269812 4765 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269835 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269854 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269868 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269883 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269899 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269913 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269927 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269941 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269953 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269966 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269979 4765 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.269994 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270007 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270018 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270030 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270042 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270057 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270075 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270088 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270100 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270113 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270142 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270156 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270169 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270183 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270196 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270209 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270224 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270237 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270250 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270261 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270272 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270284 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270297 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270311 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270325 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270338 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270352 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270365 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270377 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270392 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270406 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270420 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270435 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270449 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270462 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270474 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270489 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270502 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270521 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270533 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270546 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270566 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270582 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270620 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270634 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270668 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270683 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270696 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270711 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270725 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270739 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270754 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270768 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270783 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270797 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270811 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270823 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270838 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270885 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270897 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270909 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270920 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270932 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270945 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270957 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270971 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270984 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.270995 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271010 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271022 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271036 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271050 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271064 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271077 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271090 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271103 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271117 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271135 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271146 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271158 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271172 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271185 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271198 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271211 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271225 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271238 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271251 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271266 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271280 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271293 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271306 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271319 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271332 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271369 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271385 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271403 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271416 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271431 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271444 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271457 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271469 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271487 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271515 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271529 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271541 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271554 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271568 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271580 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271593 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271608 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271621 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271633 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271666 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271680 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271692 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271705 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271718 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271731 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271746 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271758 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271770 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271781 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271824 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271840 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271853 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271864 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271877 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271889 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271901 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271913 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271926 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271938 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271950 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271966 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271980 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.271995 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.272009 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.272020 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.272035 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.272048 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.272060 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.272072 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.272085 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.272097 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.272109 4765 reconstruct.go:97] "Volume reconstruction finished" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.272118 4765 reconciler.go:26] "Reconciler: start to sync state" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.286019 4765 manager.go:324] Recovery completed Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.300254 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.302736 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.302799 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.302813 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.303495 4765 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.303951 4765 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.303977 4765 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.304003 4765 state_mem.go:36] "Initialized new in-memory state store" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.305322 4765 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.305382 4765 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.305452 4765 kubelet.go:2335] "Starting kubelet main sync loop" Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.305691 4765 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.307428 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.307495 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.325267 4765 policy_none.go:49] "None policy: Start" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.326943 4765 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.326999 4765 state_mem.go:35] "Initializing new in-memory state store" Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.351367 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.389044 4765 manager.go:334] "Starting Device Plugin manager" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.389097 4765 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.389111 4765 server.go:79] "Starting device plugin registration server" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.389886 4765 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.389908 4765 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.390140 4765 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.390281 4765 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.390292 4765 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.396116 4765 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.406252 4765 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 08:39:16 crc kubenswrapper[4765]: 
I1003 08:39:16.406367 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.408310 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.408448 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.408595 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.408990 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.409330 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.409402 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.410683 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.410721 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.410684 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.410732 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.410749 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.410760 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.410890 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.411190 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.411257 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.411786 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.411808 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.411817 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.411959 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.412143 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.412242 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.412255 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.413354 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.413475 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.414395 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.414434 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.414479 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.414839 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.415039 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.415087 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.416460 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.416510 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.416530 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.416722 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.416849 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.416874 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.416851 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.416927 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.416945 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.417250 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.417298 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.418937 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.418968 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.418978 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.452720 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="400ms" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474309 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474458 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474499 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474582 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474621 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474663 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474690 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474716 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474797 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474846 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474885 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474908 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474927 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474950 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.474971 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.490495 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 
08:39:16.491779 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.491860 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.491874 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.491931 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.492803 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576194 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576248 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576266 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576282 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576312 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576328 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576344 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576361 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576377 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576390 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576407 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576421 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576434 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576431 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576468 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576491 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576508 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc 
kubenswrapper[4765]: I1003 08:39:16.576530 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576597 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576608 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576456 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576448 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576439 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576725 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576485 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576739 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576561 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576598 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576756 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.576801 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.692948 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.694482 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.694550 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.694562 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.694596 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.695272 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.748911 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.772200 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.780709 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.794202 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-536f0844d557522d2c44ec053d67a5c4f54216ea220be235f6697d4e2c8ba8d4 WatchSource:0}: Error finding container 536f0844d557522d2c44ec053d67a5c4f54216ea220be235f6697d4e2c8ba8d4: Status 404 returned error can't find the container with id 536f0844d557522d2c44ec053d67a5c4f54216ea220be235f6697d4e2c8ba8d4 Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.799289 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: I1003 08:39:16.803894 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.813302 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-b7947a10cb85366ec09c973704c32ad80c666f0d7eeb0235961bb93bc4a2c125 WatchSource:0}: Error finding container b7947a10cb85366ec09c973704c32ad80c666f0d7eeb0235961bb93bc4a2c125: Status 404 returned error can't find the container with id b7947a10cb85366ec09c973704c32ad80c666f0d7eeb0235961bb93bc4a2c125 Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.831137 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-65e2e319e6b0e29db06e533c14763c2f5eb8ab9a4c99d7bfbddd1cf3c92012bf WatchSource:0}: Error finding container 65e2e319e6b0e29db06e533c14763c2f5eb8ab9a4c99d7bfbddd1cf3c92012bf: Status 404 returned error can't find the container with id 65e2e319e6b0e29db06e533c14763c2f5eb8ab9a4c99d7bfbddd1cf3c92012bf Oct 03 08:39:16 crc kubenswrapper[4765]: W1003 08:39:16.833481 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ebd8b7f8e430ce2e77050a07ee3b6620a45fa9b1bb583c127e9e22876766abd8 WatchSource:0}: Error finding container ebd8b7f8e430ce2e77050a07ee3b6620a45fa9b1bb583c127e9e22876766abd8: Status 404 returned error can't find the container with id ebd8b7f8e430ce2e77050a07ee3b6620a45fa9b1bb583c127e9e22876766abd8 Oct 03 08:39:16 crc kubenswrapper[4765]: E1003 08:39:16.855088 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="800ms" Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.095891 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.099343 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.099398 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.099415 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.099444 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:39:17 crc kubenswrapper[4765]: E1003 08:39:17.099976 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 03 08:39:17 crc kubenswrapper[4765]: W1003 08:39:17.122856 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:17 crc kubenswrapper[4765]: E1003 08:39:17.122966 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.249477 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.315587 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ebd8b7f8e430ce2e77050a07ee3b6620a45fa9b1bb583c127e9e22876766abd8"} Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.317006 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"65e2e319e6b0e29db06e533c14763c2f5eb8ab9a4c99d7bfbddd1cf3c92012bf"} Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.317945 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c1998b6c4c49f99db98c6e87a2b7d0d80661da6394824cfe7c95948edcf914b9"} Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.319535 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b7947a10cb85366ec09c973704c32ad80c666f0d7eeb0235961bb93bc4a2c125"} Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.320453 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"536f0844d557522d2c44ec053d67a5c4f54216ea220be235f6697d4e2c8ba8d4"} Oct 03 08:39:17 crc kubenswrapper[4765]: W1003 08:39:17.376589 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:17 crc kubenswrapper[4765]: E1003 08:39:17.377121 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:39:17 crc kubenswrapper[4765]: W1003 08:39:17.388319 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:17 crc 
kubenswrapper[4765]: E1003 08:39:17.388385 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:39:17 crc kubenswrapper[4765]: E1003 08:39:17.656616 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="1.6s" Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.900438 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.901905 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.901977 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.901993 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:17 crc kubenswrapper[4765]: I1003 08:39:17.902056 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:39:17 crc kubenswrapper[4765]: E1003 08:39:17.902903 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 03 08:39:17 crc kubenswrapper[4765]: W1003 08:39:17.908478 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:17 crc kubenswrapper[4765]: E1003 08:39:17.908567 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.249208 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.326465 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.326457 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5"} Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.326592 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52"} Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.326625 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7"} Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.326707 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb"} Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.327430 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.327511 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.327538 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.328711 4765 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4" exitCode=0 Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.328813 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4"} Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.328828 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.329434 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.329485 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.329512 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.330854 4765 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2" exitCode=0 Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.330924 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2"} Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.330958 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.332076 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.332118 4765 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.332142 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.334906 4765 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd" exitCode=0 Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.334992 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.334998 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd"} Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.340501 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.340562 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.340578 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.341665 4765 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0" exitCode=0 Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.341725 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0"} Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.341866 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.345601 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.345773 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.345792 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.348519 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.349555 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.349583 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:18 crc kubenswrapper[4765]: I1003 08:39:18.349593 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:19 crc kubenswrapper[4765]: W1003 08:39:19.012337 4765 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:19 crc kubenswrapper[4765]: E1003 08:39:19.012476 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.249510 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:19 crc kubenswrapper[4765]: W1003 08:39:19.249557 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.173:6443: connect: connection refused Oct 03 08:39:19 crc kubenswrapper[4765]: E1003 08:39:19.249638 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.173:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:39:19 crc kubenswrapper[4765]: E1003 08:39:19.257319 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.173:6443: connect: connection refused" interval="3.2s" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.345867 4765 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0" exitCode=0 Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.345908 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0"} Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.346042 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.347474 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.347513 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.347524 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.348017 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.348062 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dee8eb78cfc7f681a7009b32e7521490cfa896aee35f8f552a150738224517be"} Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.349013 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.349033 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.349042 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.361521 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c"} Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.361638 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea"} Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.361679 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929"} Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.361693 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12"} Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.367215 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f1e92137f7438e3f6ae4b9225226f23f10f0e5e8a2b6a86f486971315d8bee00"} Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.367291 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6cedef6c592c877edfd8afe1dc09789fdc84a816a6a84d9ac9115fa494d8b5fe"} Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.367316 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.367367 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.367323 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"78b8f31e2b3f0891e3909baeb57c5a2dfe52c0e85d1aa86fe045ed54c56d5202"} Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.368299 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.368331 4765 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.368341 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.368576 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.368607 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.368620 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.502987 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.504353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.504386 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.504396 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:19 crc kubenswrapper[4765]: I1003 08:39:19.504419 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:39:19 crc kubenswrapper[4765]: E1003 08:39:19.504813 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.173:6443: connect: connection refused" node="crc" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.375000 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47"} Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.375584 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.377015 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.377046 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.377059 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.377534 4765 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f" exitCode=0 Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.377594 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f"} Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.377675 4765 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.377685 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.377724 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.377762 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.378852 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.378900 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.378913 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.378924 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.378944 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.378953 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.379072 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.379127 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:20 crc kubenswrapper[4765]: I1003 08:39:20.379140 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.386784 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110"} Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.386861 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740"} Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.386873 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.386873 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.386877 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789"} Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.387215 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e"} Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.387264 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.387282 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0"} Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.388095 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.388109 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.388129 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.388141 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.388141 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:21 crc kubenswrapper[4765]: I1003 08:39:21.388241 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.390004 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.390013 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.391292 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.391332 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.391342 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.391792 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.391839 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.391856 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.462065 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.462361 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.464261 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.464336 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.464351 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.705590 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.707366 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.707432 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.707448 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.707491 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.806775 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.807093 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.809085 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.809164 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:22 crc kubenswrapper[4765]: I1003 08:39:22.809190 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:23 crc kubenswrapper[4765]: I1003 08:39:23.141236 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:23 crc kubenswrapper[4765]: I1003 08:39:23.392062 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:23 crc kubenswrapper[4765]: I1003 08:39:23.393499 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:23 crc kubenswrapper[4765]: I1003 08:39:23.393548 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:23 crc kubenswrapper[4765]: I1003 08:39:23.393563 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:23 crc kubenswrapper[4765]: I1003 08:39:23.764781 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.394383 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.395265 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.395293 4765 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.395304 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.435842 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.436038 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.437102 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.437192 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.437208 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.544019 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.544304 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.548620 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.548758 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.548783 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:24 crc kubenswrapper[4765]: I1003 08:39:24.778265 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:25 crc kubenswrapper[4765]: I1003 08:39:25.396754 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:25 crc kubenswrapper[4765]: I1003 08:39:25.397621 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:25 crc kubenswrapper[4765]: I1003 08:39:25.397673 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:25 crc kubenswrapper[4765]: I1003 08:39:25.397683 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:26 crc kubenswrapper[4765]: I1003 08:39:26.179458 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:26 crc kubenswrapper[4765]: I1003 08:39:26.184494 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:26 crc kubenswrapper[4765]: E1003 08:39:26.396494 4765 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 08:39:26 crc kubenswrapper[4765]: I1003 08:39:26.398234 4765 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:26 crc kubenswrapper[4765]: I1003 08:39:26.399033 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:26 crc kubenswrapper[4765]: I1003 08:39:26.399065 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:26 crc kubenswrapper[4765]: I1003 08:39:26.399073 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.400862 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.402829 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.402885 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.402907 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.405617 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.531037 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.531357 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.533086 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.533141 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.533159 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.545143 4765 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 08:39:27 crc kubenswrapper[4765]: I1003 08:39:27.545255 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 08:39:28 crc kubenswrapper[4765]: I1003 08:39:28.403353 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:28 crc kubenswrapper[4765]: I1003 08:39:28.404306 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 08:39:28 crc kubenswrapper[4765]: I1003 08:39:28.404371 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:28 crc kubenswrapper[4765]: I1003 08:39:28.404386 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:29 crc kubenswrapper[4765]: I1003 08:39:29.579927 4765 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 08:39:29 crc kubenswrapper[4765]: I1003 08:39:29.580480 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 08:39:29 crc kubenswrapper[4765]: W1003 08:39:29.912395 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 03 08:39:29 crc kubenswrapper[4765]: I1003 08:39:29.912510 4765 trace.go:236] Trace[40276151]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 08:39:19.910) (total time: 10001ms): Oct 03 08:39:29 crc kubenswrapper[4765]: Trace[40276151]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (08:39:29.912) Oct 03 08:39:29 crc kubenswrapper[4765]: Trace[40276151]: [10.001863797s] [10.001863797s] END Oct 03 08:39:29 crc kubenswrapper[4765]: E1003 08:39:29.912534 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 03 08:39:30 crc kubenswrapper[4765]: I1003 08:39:30.061296 4765 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 08:39:30 crc kubenswrapper[4765]: I1003 08:39:30.061357 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 08:39:30 crc kubenswrapper[4765]: I1003 08:39:30.070515 4765 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 08:39:30 crc kubenswrapper[4765]: I1003 08:39:30.070581 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 08:39:33 crc kubenswrapper[4765]: I1003 08:39:33.146343 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:33 crc kubenswrapper[4765]: I1003 08:39:33.146552 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:33 crc kubenswrapper[4765]: I1003 08:39:33.147764 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:33 crc kubenswrapper[4765]: I1003 08:39:33.147796 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:33 crc kubenswrapper[4765]: I1003 08:39:33.147805 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:33 crc kubenswrapper[4765]: I1003 08:39:33.150390 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:33 crc kubenswrapper[4765]: I1003 08:39:33.416467 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:33 crc kubenswrapper[4765]: I1003 08:39:33.417351 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:33 crc kubenswrapper[4765]: I1003 08:39:33.417391 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:33 crc kubenswrapper[4765]: I1003 08:39:33.417403 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.044889 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.048609 4765 trace.go:236] Trace[53338930]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 08:39:20.613) (total time: 14434ms): Oct 03 08:39:35 crc kubenswrapper[4765]: Trace[53338930]: ---"Objects listed" error: 14434ms (08:39:35.048) Oct 03 08:39:35 crc kubenswrapper[4765]: Trace[53338930]: [14.434864473s] [14.434864473s] END Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.048641 4765 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.050125 4765 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.050462 4765 trace.go:236] Trace[961326393]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 08:39:22.630) (total time: 12420ms): Oct 03 08:39:35 crc kubenswrapper[4765]: Trace[961326393]: ---"Objects listed" error: 12420ms (08:39:35.050) Oct 03 08:39:35 crc kubenswrapper[4765]: Trace[961326393]: [12.420288384s] [12.420288384s] 
END Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.050473 4765 trace.go:236] Trace[1701505384]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 08:39:22.481) (total time: 12568ms): Oct 03 08:39:35 crc kubenswrapper[4765]: Trace[1701505384]: ---"Objects listed" error: 12568ms (08:39:35.050) Oct 03 08:39:35 crc kubenswrapper[4765]: Trace[1701505384]: [12.568302061s] [12.568302061s] END Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.050482 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.050534 4765 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.050486 4765 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.061086 4765 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.115370 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.120423 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.235666 4765 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48420->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.235729 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48420->192.168.126.11:17697: read: connection reset by peer" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.236452 4765 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.236508 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.243836 4765 apiserver.go:52] "Watching apiserver" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.245963 4765 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.246286 4765 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.246792 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.246858 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.246879 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.246911 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.246959 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.247071 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.247587 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.247591 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.247638 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.249095 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.250356 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.250523 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.250546 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.251777 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.251794 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.252583 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.252599 4765 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.253496 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.253669 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.271770 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.283874 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.299329 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.311515 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.322923 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.334886 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.345895 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.351200 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.351243 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.351263 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.351279 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.351294 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.351628 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.351720 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.351824 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352000 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352045 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352077 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352064 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352220 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352257 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352244 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352303 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352347 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352388 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352422 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352455 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352528 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.352539 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:39:35.852464968 +0000 UTC m=+20.153959298 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352597 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352626 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352688 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352707 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352754 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352964 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352970 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353160 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353213 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353348 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.352724 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353558 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353587 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353656 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353679 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353695 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353713 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353729 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353744 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353764 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353780 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353796 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353811 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353827 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353867 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353871 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353872 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353930 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353939 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.353991 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354016 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354038 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354063 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354085 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354096 4765 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354106 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354132 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354133 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354162 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354186 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354212 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354236 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354258 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354280 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354300 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354319 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354342 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354363 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354384 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354406 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354430 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354455 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354476 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354568 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354593 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354617 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354639 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354683 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354751 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354780 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354805 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354827 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354849 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354872 4765 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354895 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354917 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354939 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354959 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355003 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355022 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355040 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355055 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355077 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355099 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355119 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355142 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355167 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355187 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355207 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355227 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355251 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355399 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355422 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355443 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355463 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355486 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355508 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355530 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355562 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355582 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355612 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355634 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355676 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355699 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355721 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355741 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355760 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355778 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355825 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355847 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355871 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355898 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355919 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355939 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355961 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355985 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356007 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356028 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356050 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356072 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356095 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356115 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356136 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356155 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356177 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356202 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356225 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356248 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356268 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356289 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356308 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356332 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356354 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356377 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356398 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356421 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356441 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356464 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356488 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356512 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356534 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356558 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356580 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356603 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356624 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356684 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357081 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357145 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357168 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357192 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357212 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357232 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 
08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357253 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357276 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357295 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357312 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357330 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357347 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357369 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357385 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357402 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357418 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357439 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357458 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357474 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357490 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357506 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357521 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357537 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357552 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357568 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357584 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357600 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357618 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357640 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357677 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357701 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357730 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357752 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357775 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357801 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357827 4765 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357850 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357874 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357896 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357919 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357940 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357961 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357987 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358012 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358035 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358058 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358085 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358109 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358135 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358157 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358179 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358395 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358418 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358441 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358462 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358487 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358510 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358563 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358596 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358620 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358659 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358690 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358746 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358775 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:39:35 crc 
kubenswrapper[4765]: I1003 08:39:35.358801 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358829 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358861 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358887 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358911 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358934 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358960 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359040 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359059 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 
08:39:35.359074 4765 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359088 4765 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359101 4765 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359117 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359131 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359144 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359157 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359171 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359184 4765 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359198 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359212 4765 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359226 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359241 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node 
\"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359254 4765 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359267 4765 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359281 4765 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359295 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.363177 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.367455 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354211 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.369832 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354281 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.369848 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354388 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.369890 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354408 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354559 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354555 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354582 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355083 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355173 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355333 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355464 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355573 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355577 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355580 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355708 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.355978 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356029 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356224 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356253 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356253 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356339 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356471 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.356521 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357102 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357884 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.357949 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358154 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358481 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358580 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.370232 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358769 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.358824 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359057 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359051 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359542 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.359786 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.360511 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.360579 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.360623 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.360885 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.361077 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.361971 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.362442 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.362814 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.362868 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.363236 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.363988 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.364155 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.364247 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.364174 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.364431 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.364587 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.364607 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.364817 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.365061 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.365119 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.365222 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.365251 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.365576 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.365733 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.365965 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.366042 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.366167 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.366484 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.366526 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.367016 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.367437 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.367696 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.367891 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.367934 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.368043 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.368291 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.368904 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.369067 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.369295 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.369348 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.369664 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.354380 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.370405 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.370831 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:35.87080813 +0000 UTC m=+20.172302460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.371234 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.371355 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.372569 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.372819 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.372953 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.373008 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.373063 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.373263 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.373285 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.373600 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.375917 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.376318 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.376618 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.376794 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.377058 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.377065 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.377100 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.377078 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.376531 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.377365 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.377472 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.377913 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.378011 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.378166 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.378767 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.379289 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.379457 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.379567 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.380451 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.380968 4765 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.383098 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.383804 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.387152 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:35.887118503 +0000 UTC m=+20.188612833 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.391168 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.391384 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.391536 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.391610 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.393997 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.394518 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.395491 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.395533 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.395801 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.395932 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.395957 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.395973 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.396046 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:35.896022147 +0000 UTC m=+20.197516537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.397065 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.397616 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.397984 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.398253 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.398285 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.398300 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.398337 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.398349 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:35.898335043 +0000 UTC m=+20.199829373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.398252 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.398573 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.399218 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.399457 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.399460 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.399598 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.399658 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.399285 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.399244 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.400347 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.400457 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.400574 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.400823 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.400864 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.400895 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.400983 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.401182 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.401384 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.401458 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.401702 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.401865 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.404198 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.404399 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.404538 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405694 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.404627 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.404846 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405039 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405078 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405550 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405164 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405156 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405089 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405149 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405344 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405343 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405385 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405403 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405422 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405586 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405568 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405572 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.405785 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.406297 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.406312 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.406340 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.406512 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.406612 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.406876 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.408136 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.408152 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.408280 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.408341 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.409026 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.409401 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.420740 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.424781 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.425289 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.425559 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.426957 4765 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47" exitCode=255 Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.427035 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47"} Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.434795 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.435215 4765 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.440802 4765 scope.go:117] "RemoveContainer" containerID="aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.441255 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.442555 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.453667 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460040 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460082 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460158 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460163 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460174 4765 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460207 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460219 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460229 4765 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 
08:39:35.460239 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460250 4765 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460261 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460272 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460283 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460293 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460303 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460313 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460325 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460336 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460346 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460358 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460358 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460372 4765 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460392 4765 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460405 4765 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460416 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460427 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460438 4765 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460449 4765 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460461 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460473 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460494 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460506 4765 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460518 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460543 4765 reconciler_common.go:293] "Volume 
detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460554 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460565 4765 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460573 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460582 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460590 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460598 4765 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460607 4765 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460617 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460627 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460636 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460662 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460673 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460683 4765 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460694 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460702 4765 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460709 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460717 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460726 4765 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460735 4765 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460743 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460752 4765 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460760 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460774 4765 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460783 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460791 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460799 4765 reconciler_common.go:293] "Volume 
detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460808 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460816 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460824 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460832 4765 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460840 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460848 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460856 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460916 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460927 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460936 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460944 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460960 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460969 4765 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.460994 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461003 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461011 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461036 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461052 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461062 4765 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461072 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461105 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461214 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461222 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461229 4765 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461237 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461245 4765 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" 
DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461253 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461260 4765 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461268 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461275 4765 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461283 4765 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461291 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461299 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461306 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461314 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461321 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461329 4765 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461336 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461344 4765 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461352 4765 
reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461359 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461366 4765 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461380 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461387 4765 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461399 4765 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461407 4765 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461414 4765 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461422 4765 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461430 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461437 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461445 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461453 4765 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461461 
4765 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461471 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461479 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461488 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461495 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461504 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461512 4765 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461520 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461528 4765 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461535 4765 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461544 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461553 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461561 4765 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461568 4765 
reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461578 4765 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461586 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461594 4765 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461602 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461609 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461617 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461624 4765 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461633 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461655 4765 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461662 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461673 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461681 4765 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 
03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461690 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461698 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461706 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461713 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461721 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461728 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461736 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461744 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461753 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461761 4765 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461768 4765 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461777 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461785 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 
08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461793 4765 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461801 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461808 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461816 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461827 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461845 4765 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461856 4765 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461867 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461877 4765 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461889 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461900 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461913 4765 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461926 4765 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461937 4765 reconciler_common.go:293] 
"Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461948 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461962 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461972 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461985 4765 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.461996 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.462007 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.462019 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.462031 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.462419 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.462432 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.462440 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.462448 4765 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.462456 4765 reconciler_common.go:293] 
"Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.462467 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.463473 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.472972 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.486828 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.496902 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.512927 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.558788 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.568734 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.577381 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.868386 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.868557 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:39:36.868522881 +0000 UTC m=+21.170017221 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.970171 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.970239 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.970270 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:35 crc kubenswrapper[4765]: I1003 08:39:35.970300 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970334 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970445 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:36.970414285 +0000 UTC m=+21.271908605 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970455 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970509 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:36.970488477 +0000 UTC m=+21.271982807 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970600 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970620 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970634 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970713 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:36.970699462 +0000 UTC m=+21.272193792 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970785 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970829 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970842 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:35 crc kubenswrapper[4765]: E1003 08:39:35.970923 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:36.970902347 +0000 UTC m=+21.272396677 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.311270 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.312065 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.313060 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.313985 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.314859 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.315401 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.316162 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.316869 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.317563 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.320015 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.320549 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.321943 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.322487 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.323624 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.324306 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.326270 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.327121 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.327634 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.328892 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.329674 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.330236 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.331758 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.332308 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.333611 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.334104 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.335495 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.336252 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.336845 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.337986 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.338132 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.338882 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.339927 4765 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.340061 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.342086 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.342998 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.343385 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.345152 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.346394 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.347039 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.348244 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.349051 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.349595 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.350425 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.350742 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.352054 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.352659 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.353534 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.354105 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.355073 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.355812 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.356834 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.357317 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.357895 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.358840 4765 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.359402 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.360247 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.364103 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.377801 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.425188 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.441686 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"028fe135b9ce98b6566f2f9e285a1d2f2109da7aa755ac1b7b2779cabe33e1f2"} Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.443280 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778"} Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.443336 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"765de60e30b8cb55e1e2e6d3cdb5f0ba238a352d79696a57c2522e6d0bba713b"} Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.448640 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c"} Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.448707 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2"} Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.448716 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8a752661d18f6e0a8d60c3e69863a395e494616ee2d1858a3337c38ef771b94e"} Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.450654 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.452277 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd"} Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.472678 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e
6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.492457 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.508077 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.522311 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.537849 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-4bmrv"] Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.538579 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-p9gf5"] Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.538853 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-csb5z"] Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.538965 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.539015 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p9gf5" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.539144 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.540799 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-j8mss"] Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.541207 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.545529 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.545683 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.545683 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.546638 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.546957 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.547234 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.547768 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.547912 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.548395 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.548559 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.548708 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.548909 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.549121 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.549363 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.549412 4765 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.552154 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.569302 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.576443 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/912755c8-dd28-4fbc-82de-9cf85df54f4f-cni-binary-copy\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.576479 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-run-k8s-cni-cncf-io\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " 
pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.576499 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d636dbad-9ffa-4ba7-953f-adea04b76a23-rootfs\") pod \"machine-config-daemon-j8mss\" (UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.576604 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhzzj\" (UniqueName: \"kubernetes.io/projected/d636dbad-9ffa-4ba7-953f-adea04b76a23-kube-api-access-dhzzj\") pod \"machine-config-daemon-j8mss\" (UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.576623 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-socket-dir-parent\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.576722 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d636dbad-9ffa-4ba7-953f-adea04b76a23-proxy-tls\") pod \"machine-config-daemon-j8mss\" (UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.576763 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lhz2\" (UniqueName: \"kubernetes.io/projected/4f105c06-3e67-486f-a622-923ae442117c-kube-api-access-6lhz2\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.576793 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-var-lib-cni-bin\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.576815 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-cnibin\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.576831 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-system-cni-dir\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.576847 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-var-lib-cni-multus\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577113 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-os-release\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577145 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ldt7\" (UniqueName: \"kubernetes.io/projected/46c76a49-e10b-4a12-a6c7-12c330cd3c4e-kube-api-access-2ldt7\") pod \"node-resolver-p9gf5\" (UID: \"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\") " pod="openshift-dns/node-resolver-p9gf5" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577161 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-run-netns\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577178 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-daemon-config\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577194 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f105c06-3e67-486f-a622-923ae442117c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577209 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-etc-kubernetes\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577263 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/46c76a49-e10b-4a12-a6c7-12c330cd3c4e-hosts-file\") pod \"node-resolver-p9gf5\" (UID: \"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\") " pod="openshift-dns/node-resolver-p9gf5" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577289 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-cni-dir\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577310 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-os-release\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577334 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577351 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-run-multus-certs\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577368 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-system-cni-dir\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577396 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f105c06-3e67-486f-a622-923ae442117c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577414 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-var-lib-kubelet\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577464 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-cnibin\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577496 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8k2k\" (UniqueName: \"kubernetes.io/projected/912755c8-dd28-4fbc-82de-9cf85df54f4f-kube-api-access-w8k2k\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577523 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d636dbad-9ffa-4ba7-953f-adea04b76a23-mcd-auth-proxy-config\") pod \"machine-config-daemon-j8mss\" (UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " 
pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577539 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-hostroot\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.577553 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-conf-dir\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.585468 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.598529 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.615419 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.631247 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.644987 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.659627 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.676377 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.678764 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-var-lib-kubelet\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.678824 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f105c06-3e67-486f-a622-923ae442117c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc 
kubenswrapper[4765]: I1003 08:39:36.678887 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8k2k\" (UniqueName: \"kubernetes.io/projected/912755c8-dd28-4fbc-82de-9cf85df54f4f-kube-api-access-w8k2k\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.678905 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-cnibin\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.678921 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d636dbad-9ffa-4ba7-953f-adea04b76a23-mcd-auth-proxy-config\") pod \"machine-config-daemon-j8mss\" (UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.678937 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-hostroot\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.678953 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-conf-dir\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.678970 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-run-k8s-cni-cncf-io\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.678987 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/912755c8-dd28-4fbc-82de-9cf85df54f4f-cni-binary-copy\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679003 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d636dbad-9ffa-4ba7-953f-adea04b76a23-rootfs\") pod \"machine-config-daemon-j8mss\" (UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679025 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d636dbad-9ffa-4ba7-953f-adea04b76a23-proxy-tls\") pod \"machine-config-daemon-j8mss\" (UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679040 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dhzzj\" (UniqueName: \"kubernetes.io/projected/d636dbad-9ffa-4ba7-953f-adea04b76a23-kube-api-access-dhzzj\") pod \"machine-config-daemon-j8mss\" (UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679058 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-socket-dir-parent\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679109 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lhz2\" (UniqueName: \"kubernetes.io/projected/4f105c06-3e67-486f-a622-923ae442117c-kube-api-access-6lhz2\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679135 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-cnibin\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679155 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-var-lib-cni-bin\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679175 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ldt7\" (UniqueName: \"kubernetes.io/projected/46c76a49-e10b-4a12-a6c7-12c330cd3c4e-kube-api-access-2ldt7\") pod \"node-resolver-p9gf5\" (UID: \"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\") " pod="openshift-dns/node-resolver-p9gf5" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679195 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-system-cni-dir\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679215 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-var-lib-cni-multus\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679240 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-os-release\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679258 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f105c06-3e67-486f-a622-923ae442117c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679276 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-run-netns\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679292 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-daemon-config\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679310 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-cni-dir\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679360 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-etc-kubernetes\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679383 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/46c76a49-e10b-4a12-a6c7-12c330cd3c4e-hosts-file\") pod \"node-resolver-p9gf5\" (UID: \"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\") " pod="openshift-dns/node-resolver-p9gf5" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679400 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-os-release\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679422 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-system-cni-dir\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679444 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679463 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-run-multus-certs\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679557 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-run-multus-certs\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.679601 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-var-lib-kubelet\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.680345 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4f105c06-3e67-486f-a622-923ae442117c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.680637 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-cnibin\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.681226 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d636dbad-9ffa-4ba7-953f-adea04b76a23-mcd-auth-proxy-config\") pod \"machine-config-daemon-j8mss\" (UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.681283 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-hostroot\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.681317 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-conf-dir\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.681347 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-run-k8s-cni-cncf-io\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.681587 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-var-lib-cni-multus\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") 
" pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.681673 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d636dbad-9ffa-4ba7-953f-adea04b76a23-rootfs\") pod \"machine-config-daemon-j8mss\" (UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.681946 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-cni-dir\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682004 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-cnibin\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682048 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-etc-kubernetes\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682014 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/912755c8-dd28-4fbc-82de-9cf85df54f4f-cni-binary-copy\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682102 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/46c76a49-e10b-4a12-a6c7-12c330cd3c4e-hosts-file\") pod \"node-resolver-p9gf5\" (UID: \"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\") " pod="openshift-dns/node-resolver-p9gf5" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682175 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-var-lib-cni-bin\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682240 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-socket-dir-parent\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682336 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-os-release\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682342 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-system-cni-dir\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682380 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-system-cni-dir\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682330 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-os-release\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682426 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/912755c8-dd28-4fbc-82de-9cf85df54f4f-host-run-netns\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682691 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4f105c06-3e67-486f-a622-923ae442117c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682932 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4f105c06-3e67-486f-a622-923ae442117c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.682958 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/912755c8-dd28-4fbc-82de-9cf85df54f4f-multus-daemon-config\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.687480 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d636dbad-9ffa-4ba7-953f-adea04b76a23-proxy-tls\") pod \"machine-config-daemon-j8mss\" (UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.701210 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.705850 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhzzj\" (UniqueName: \"kubernetes.io/projected/d636dbad-9ffa-4ba7-953f-adea04b76a23-kube-api-access-dhzzj\") pod \"machine-config-daemon-j8mss\" 
(UID: \"d636dbad-9ffa-4ba7-953f-adea04b76a23\") " pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.707277 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lhz2\" (UniqueName: \"kubernetes.io/projected/4f105c06-3e67-486f-a622-923ae442117c-kube-api-access-6lhz2\") pod \"multus-additional-cni-plugins-4bmrv\" (UID: \"4f105c06-3e67-486f-a622-923ae442117c\") " pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.709574 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8k2k\" (UniqueName: \"kubernetes.io/projected/912755c8-dd28-4fbc-82de-9cf85df54f4f-kube-api-access-w8k2k\") pod \"multus-csb5z\" (UID: \"912755c8-dd28-4fbc-82de-9cf85df54f4f\") " pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.709730 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ldt7\" (UniqueName: \"kubernetes.io/projected/46c76a49-e10b-4a12-a6c7-12c330cd3c4e-kube-api-access-2ldt7\") pod \"node-resolver-p9gf5\" (UID: \"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\") " pod="openshift-dns/node-resolver-p9gf5" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.724236 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.747634 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.768604 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.790589 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.845567 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa9308
9f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.854837 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-csb5z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.865922 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p9gf5" Oct 03 08:39:36 crc kubenswrapper[4765]: W1003 08:39:36.871598 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod912755c8_dd28_4fbc_82de_9cf85df54f4f.slice/crio-fd2ad09ba6976acbfa0aac07cfcb1386a545ab56dd95fdf87edfa11c5d91a2ea WatchSource:0}: Error finding container fd2ad09ba6976acbfa0aac07cfcb1386a545ab56dd95fdf87edfa11c5d91a2ea: Status 404 returned error can't find the container with id fd2ad09ba6976acbfa0aac07cfcb1386a545ab56dd95fdf87edfa11c5d91a2ea Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.872910 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.880743 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.880911 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:39:38.880878072 +0000 UTC m=+23.182372412 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.888059 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.897222 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.934680 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-a
piserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.984245 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.984322 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.984354 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.984383 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.984544 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.984659 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:38.984622031 +0000 UTC m=+23.286116361 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.984954 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.985051 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:38.985022941 +0000 UTC m=+23.286517431 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.985152 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.985179 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.985194 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.985229 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:38.985219205 +0000 UTC m=+23.286713695 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.985159 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.985262 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.985273 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:36 crc kubenswrapper[4765]: E1003 08:39:36.985311 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:38.985302537 +0000 UTC m=+23.286797107 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.985761 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-srgbb"] Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.986810 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.988537 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.992757 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.993033 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.993508 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.993349 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.993424 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.993920 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 03 08:39:36 crc kubenswrapper[4765]: I1003 08:39:36.994113 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.026130 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.041139 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.058001 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.075435 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.085750 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-netns\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.085792 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-var-lib-openvswitch\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086159 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-node-log\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086191 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-config\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086377 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-systemd-units\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086450 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ea01fba1-445f-46c1-898c-1ceb34866850-ovn-node-metrics-cert\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc 
kubenswrapper[4765]: I1003 08:39:37.086470 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-script-lib\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086489 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-systemd\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086509 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-etc-openvswitch\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086537 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-env-overrides\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086580 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4zqv\" (UniqueName: \"kubernetes.io/projected/ea01fba1-445f-46c1-898c-1ceb34866850-kube-api-access-m4zqv\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086606 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-kubelet\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086626 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-ovn\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086678 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-ovn-kubernetes\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086716 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086744 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-openvswitch\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086763 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-bin\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086784 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-slash\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086803 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-log-socket\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.086825 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-netd\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.099553 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: 
I1003 08:39:37.114606 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.134508 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.149914 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.168353 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.187901 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4zqv\" (UniqueName: \"kubernetes.io/projected/ea01fba1-445f-46c1-898c-1ceb34866850-kube-api-access-m4zqv\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.187951 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-ovn\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.187977 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-ovn-kubernetes\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188019 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-kubelet\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188042 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188065 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-bin\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188095 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-openvswitch\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188115 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-slash\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188132 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-log-socket\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188151 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-netd\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188168 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-var-lib-openvswitch\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188187 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-netns\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188204 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-node-log\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188223 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-config\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188249 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-systemd-units\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188268 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ea01fba1-445f-46c1-898c-1ceb34866850-ovn-node-metrics-cert\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188288 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-script-lib\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188307 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-systemd\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188325 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-etc-openvswitch\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188352 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-env-overrides\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188536 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-netd\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188613 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-ovn\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188662 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-ovn-kubernetes\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188697 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-kubelet\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188741 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188786 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-bin\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188833 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-openvswitch\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188875 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-slash\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188909 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-log-socket\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188952 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-systemd-units\") pod \"ovnkube-node-srgbb\" (UID: 
\"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.188988 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-var-lib-openvswitch\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.189015 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-env-overrides\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.189023 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-netns\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.189059 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-node-log\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.189759 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-etc-openvswitch\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.189767 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-config\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.189780 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-systemd\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.190419 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-script-lib\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.192714 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.194367 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ea01fba1-445f-46c1-898c-1ceb34866850-ovn-node-metrics-cert\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.210837 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4zqv\" (UniqueName: \"kubernetes.io/projected/ea01fba1-445f-46c1-898c-1ceb34866850-kube-api-access-m4zqv\") pod \"ovnkube-node-srgbb\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.212402 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.231632 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.248956 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.270157 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.305783 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.305846 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.305855 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:37 crc kubenswrapper[4765]: E1003 08:39:37.305973 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:37 crc kubenswrapper[4765]: E1003 08:39:37.306069 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.306116 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:37 crc kubenswrapper[4765]: E1003 08:39:37.306235 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:37 crc kubenswrapper[4765]: W1003 08:39:37.321188 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea01fba1_445f_46c1_898c_1ceb34866850.slice/crio-5b0d53b60064af65d60dce9388dbb62aebfc827635dea2e607ca3af27c7883de WatchSource:0}: Error finding container 5b0d53b60064af65d60dce9388dbb62aebfc827635dea2e607ca3af27c7883de: Status 404 returned error can't find the container with id 5b0d53b60064af65d60dce9388dbb62aebfc827635dea2e607ca3af27c7883de Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.457164 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" event={"ID":"4f105c06-3e67-486f-a622-923ae442117c","Type":"ContainerStarted","Data":"ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b"} Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.457271 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" event={"ID":"4f105c06-3e67-486f-a622-923ae442117c","Type":"ContainerStarted","Data":"53c9d3430352eb2c0f01fd9ea08778557734b1be1c9f8b65c93730201c615711"} Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.459540 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p9gf5" event={"ID":"46c76a49-e10b-4a12-a6c7-12c330cd3c4e","Type":"ContainerStarted","Data":"d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d"} Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.459604 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p9gf5" event={"ID":"46c76a49-e10b-4a12-a6c7-12c330cd3c4e","Type":"ContainerStarted","Data":"cc4a33aaeaf5832f66b6e5e9f91d41c8d081cb37ff450367d02b7c3d582a0ab3"} Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.462049 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a"} Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.462135 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281"} Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.462152 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"2d92e46583fd22e67d6c1836819a4cf9723055c3e14c56faa39be07869968f2f"} Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.464150 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-csb5z" event={"ID":"912755c8-dd28-4fbc-82de-9cf85df54f4f","Type":"ContainerStarted","Data":"d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1"} Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.464192 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-csb5z" 
event={"ID":"912755c8-dd28-4fbc-82de-9cf85df54f4f","Type":"ContainerStarted","Data":"fd2ad09ba6976acbfa0aac07cfcb1386a545ab56dd95fdf87edfa11c5d91a2ea"} Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.466293 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416" exitCode=0 Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.466909 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416"} Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.466985 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"5b0d53b60064af65d60dce9388dbb62aebfc827635dea2e607ca3af27c7883de"} Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.467006 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.479259 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.498408 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.511959 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.539019 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.558104 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.570267 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.581057 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.588190 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.605948 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.621095 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.637996 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.657971 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.674240 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.677397 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.694055 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.741240 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: 
I1003 08:39:37.774357 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.795607 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.817793 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.834107 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.852215 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.867608 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.905596 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.938440 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.960240 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.980100 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z 
is after 2025-08-24T17:21:41Z" Oct 03 08:39:37 crc kubenswrapper[4765]: I1003 08:39:37.997809 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.017542 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.035441 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.049908 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.470795 4765 generic.go:334] "Generic (PLEG): container finished" podID="4f105c06-3e67-486f-a622-923ae442117c" containerID="ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b" exitCode=0 Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.470878 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" event={"ID":"4f105c06-3e67-486f-a622-923ae442117c","Type":"ContainerDied","Data":"ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b"} Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.474573 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c"} Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.478817 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458"} Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.478855 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c"} Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.478866 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da"} Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.478875 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b"} Oct 03 
08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.478884 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a"} Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.478901 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca"} Oct 03 08:39:38 crc kubenswrapper[4765]: E1003 08:39:38.486620 4765 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.486707 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.502856 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.520196 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.536371 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.549114 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.572572 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441e
cd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.588720 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.609140 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.632441 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z 
is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.646589 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.660156 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.670878 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.685271 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.702172 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc 
kubenswrapper[4765]: I1003 08:39:38.717036 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.734229 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.748414 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.770730 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.787453 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.806853 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.820991 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.832946 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.847081 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.862154 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.881914 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.894434 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.907437 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:38 crc kubenswrapper[4765]: E1003 08:39:38.907750 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 08:39:42.907722615 +0000 UTC m=+27.209216955 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:39:38 crc kubenswrapper[4765]: I1003 08:39:38.909004 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:38 crc 
kubenswrapper[4765]: I1003 08:39:38.921920 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.008304 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.008366 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 
08:39:39.008395 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.008421 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008587 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008595 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008658 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008674 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008690 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:43.008666767 +0000 UTC m=+27.310161097 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008748 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:43.008722249 +0000 UTC m=+27.310216759 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008593 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008801 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:43.00879225 +0000 UTC m=+27.310286790 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008805 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008847 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008867 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.008952 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:43.008923713 +0000 UTC m=+27.310418043 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.296152 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-svqbq"] Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.296924 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-svqbq" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.298273 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.298678 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.299796 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.300127 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.305912 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.305910 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.306042 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.306214 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.307412 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:39 crc kubenswrapper[4765]: E1003 08:39:39.307514 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.314247 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.327076 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.342131 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.356575 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.369059 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.383579 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.410762 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.411537 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84cdf1d7-9997-4015-bdbf-eedacc081685-host\") pod \"node-ca-svqbq\" (UID: \"84cdf1d7-9997-4015-bdbf-eedacc081685\") " pod="openshift-image-registry/node-ca-svqbq" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.411591 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/84cdf1d7-9997-4015-bdbf-eedacc081685-serviceca\") pod \"node-ca-svqbq\" (UID: \"84cdf1d7-9997-4015-bdbf-eedacc081685\") " pod="openshift-image-registry/node-ca-svqbq" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.411621 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm2z5\" (UniqueName: \"kubernetes.io/projected/84cdf1d7-9997-4015-bdbf-eedacc081685-kube-api-access-tm2z5\") pod \"node-ca-svqbq\" (UID: \"84cdf1d7-9997-4015-bdbf-eedacc081685\") " pod="openshift-image-registry/node-ca-svqbq" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.426425 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.440404 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.458909 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z 
is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.476052 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.486581 4765 generic.go:334] "Generic (PLEG): container finished" podID="4f105c06-3e67-486f-a622-923ae442117c" containerID="261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc" exitCode=0 Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.486688 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" event={"ID":"4f105c06-3e67-486f-a622-923ae442117c","Type":"ContainerDied","Data":"261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc"} Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.493315 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.506478 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.512671 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84cdf1d7-9997-4015-bdbf-eedacc081685-host\") pod \"node-ca-svqbq\" (UID: \"84cdf1d7-9997-4015-bdbf-eedacc081685\") " pod="openshift-image-registry/node-ca-svqbq" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.512900 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84cdf1d7-9997-4015-bdbf-eedacc081685-host\") pod \"node-ca-svqbq\" (UID: \"84cdf1d7-9997-4015-bdbf-eedacc081685\") " pod="openshift-image-registry/node-ca-svqbq" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.513289 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/84cdf1d7-9997-4015-bdbf-eedacc081685-serviceca\") pod \"node-ca-svqbq\" (UID: \"84cdf1d7-9997-4015-bdbf-eedacc081685\") " pod="openshift-image-registry/node-ca-svqbq" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.513387 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm2z5\" (UniqueName: \"kubernetes.io/projected/84cdf1d7-9997-4015-bdbf-eedacc081685-kube-api-access-tm2z5\") pod \"node-ca-svqbq\" (UID: \"84cdf1d7-9997-4015-bdbf-eedacc081685\") " pod="openshift-image-registry/node-ca-svqbq" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.514694 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/84cdf1d7-9997-4015-bdbf-eedacc081685-serviceca\") pod \"node-ca-svqbq\" (UID: \"84cdf1d7-9997-4015-bdbf-eedacc081685\") " pod="openshift-image-registry/node-ca-svqbq" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.523026 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.536233 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm2z5\" (UniqueName: \"kubernetes.io/projected/84cdf1d7-9997-4015-bdbf-eedacc081685-kube-api-access-tm2z5\") pod \"node-ca-svqbq\" (UID: \"84cdf1d7-9997-4015-bdbf-eedacc081685\") " pod="openshift-image-registry/node-ca-svqbq" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.539690 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.559634 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.572809 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.585064 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.597086 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.610628 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-svqbq" Oct 03 08:39:39 crc kubenswrapper[4765]: W1003 08:39:39.627923 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84cdf1d7_9997_4015_bdbf_eedacc081685.slice/crio-2e79be5bbd5546c72cb76b883ca0163407b15febd54f81a57654df5409c3bf96 WatchSource:0}: Error finding container 2e79be5bbd5546c72cb76b883ca0163407b15febd54f81a57654df5409c3bf96: Status 404 returned error can't find the container with id 2e79be5bbd5546c72cb76b883ca0163407b15febd54f81a57654df5409c3bf96 Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.639935 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.678496 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.716140 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.803148 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.823565 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.839510 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.877183 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.916962 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.959308 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:39 crc kubenswrapper[4765]: I1003 08:39:39.997087 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.038476 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.494616 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf"} Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.496865 4765 generic.go:334] "Generic (PLEG): container finished" podID="4f105c06-3e67-486f-a622-923ae442117c" containerID="8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac" exitCode=0 Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.496917 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" event={"ID":"4f105c06-3e67-486f-a622-923ae442117c","Type":"ContainerDied","Data":"8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac"} Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.498413 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-svqbq" event={"ID":"84cdf1d7-9997-4015-bdbf-eedacc081685","Type":"ContainerStarted","Data":"43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf"} Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.498457 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-svqbq" 
event={"ID":"84cdf1d7-9997-4015-bdbf-eedacc081685","Type":"ContainerStarted","Data":"2e79be5bbd5546c72cb76b883ca0163407b15febd54f81a57654df5409c3bf96"} Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.521768 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.538199 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.562136 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.576040 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.594505 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.608274 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.627844 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.647457 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.665858 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.693436 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.706304 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.719426 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.734244 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.752024 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.765557 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.783140 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.798308 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.812363 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.836313 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.850616 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.881103 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.917397 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.957903 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:40 crc kubenswrapper[4765]: I1003 08:39:40.998263 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.034874 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.084333 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.117260 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.158198 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.197603 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.237342 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.306065 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.306114 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.306087 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:41 crc kubenswrapper[4765]: E1003 08:39:41.306214 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:41 crc kubenswrapper[4765]: E1003 08:39:41.306326 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:41 crc kubenswrapper[4765]: E1003 08:39:41.306413 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.451572 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.453755 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.453794 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.453810 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.453945 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.463276 4765 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.463597 4765 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.464791 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.464822 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.464834 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.464852 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.464867 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:41Z","lastTransitionTime":"2025-10-03T08:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:41 crc kubenswrapper[4765]: E1003 08:39:41.480804 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.485190 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.485234 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.485249 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.485268 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.485280 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:41Z","lastTransitionTime":"2025-10-03T08:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:41 crc kubenswrapper[4765]: E1003 08:39:41.500672 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.504633 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.504694 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.504704 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.504717 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.504740 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:41Z","lastTransitionTime":"2025-10-03T08:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.506360 4765 generic.go:334] "Generic (PLEG): container finished" podID="4f105c06-3e67-486f-a622-923ae442117c" containerID="9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da" exitCode=0 Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.506395 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" event={"ID":"4f105c06-3e67-486f-a622-923ae442117c","Type":"ContainerDied","Data":"9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da"} Oct 03 08:39:41 crc kubenswrapper[4765]: E1003 08:39:41.519289 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.522408 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.522450 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.522462 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.522480 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.522492 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:41Z","lastTransitionTime":"2025-10-03T08:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.532211 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: E1003 08:39:41.535112 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.544359 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.544395 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.544405 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.544421 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.544432 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:41Z","lastTransitionTime":"2025-10-03T08:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.550212 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: E1003 08:39:41.559162 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: E1003 08:39:41.559308 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.561691 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.561715 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.561723 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.561738 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.561748 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:41Z","lastTransitionTime":"2025-10-03T08:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.565150 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.577854 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.605885 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.638730 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.658637 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.664415 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.664456 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.664467 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.664484 4765 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.664495 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:41Z","lastTransitionTime":"2025-10-03T08:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.681021 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z 
is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.702139 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.719212 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.733604 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\"
 for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.756687 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.767630 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.767687 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.767699 4765 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.767716 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.767729 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:41Z","lastTransitionTime":"2025-10-03T08:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.801563 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.838546 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.870812 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.870859 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.870871 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.870890 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.870902 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:41Z","lastTransitionTime":"2025-10-03T08:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.879497 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.974567 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.974603 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.974613 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.974629 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:41 crc kubenswrapper[4765]: I1003 08:39:41.974640 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:41Z","lastTransitionTime":"2025-10-03T08:39:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.077177 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.077216 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.077225 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.077240 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.077248 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:42Z","lastTransitionTime":"2025-10-03T08:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.180174 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.180217 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.180226 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.180244 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.180257 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:42Z","lastTransitionTime":"2025-10-03T08:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.283591 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.283689 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.283703 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.283723 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.283736 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:42Z","lastTransitionTime":"2025-10-03T08:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.386719 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.386779 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.386792 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.386814 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.386830 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:42Z","lastTransitionTime":"2025-10-03T08:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.489891 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.489932 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.489942 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.489959 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.489970 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:42Z","lastTransitionTime":"2025-10-03T08:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.515475 4765 generic.go:334] "Generic (PLEG): container finished" podID="4f105c06-3e67-486f-a622-923ae442117c" containerID="23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853" exitCode=0 Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.515542 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" event={"ID":"4f105c06-3e67-486f-a622-923ae442117c","Type":"ContainerDied","Data":"23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853"} Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.531973 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.552538 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.567370 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.587133 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.592630 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.592674 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.592683 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.592701 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.592712 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:42Z","lastTransitionTime":"2025-10-03T08:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.602019 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.618982 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.633336 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.648550 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.663503 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.677558 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.690276 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.694930 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.694967 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.694979 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.694996 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.695008 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:42Z","lastTransitionTime":"2025-10-03T08:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.711268 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.722722 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.736144 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.747477 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.797351 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.797397 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.797409 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.797429 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.797441 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:42Z","lastTransitionTime":"2025-10-03T08:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.900754 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.901114 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.901127 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.901148 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.901157 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:42Z","lastTransitionTime":"2025-10-03T08:39:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:42 crc kubenswrapper[4765]: I1003 08:39:42.950037 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:42 crc kubenswrapper[4765]: E1003 08:39:42.950245 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:39:50.950216861 +0000 UTC m=+35.251711191 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.003581 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.003619 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.003628 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.003657 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.003669 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:43Z","lastTransitionTime":"2025-10-03T08:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.051415 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.051460 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.051494 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.051525 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.051596 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.051674 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:51.051631645 +0000 UTC m=+35.353125965 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.052061 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.052080 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.052092 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.052118 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:51.052109326 +0000 UTC m=+35.353603656 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.052160 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.052186 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:51.052179958 +0000 UTC m=+35.353674288 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.052224 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.052233 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.052240 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.052259 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:39:51.05225385 +0000 UTC m=+35.353748180 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.106719 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.106784 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.106801 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.106827 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.106845 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:43Z","lastTransitionTime":"2025-10-03T08:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.210234 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.210269 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.210279 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.210296 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.210305 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:43Z","lastTransitionTime":"2025-10-03T08:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.305811 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.305912 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.305835 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.305999 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.306065 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:43 crc kubenswrapper[4765]: E1003 08:39:43.306209 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.312887 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.312922 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.312938 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.312956 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.312967 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:43Z","lastTransitionTime":"2025-10-03T08:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.415292 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.415344 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.415353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.415370 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.415381 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:43Z","lastTransitionTime":"2025-10-03T08:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.517730 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.517781 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.517794 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.517812 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.517828 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:43Z","lastTransitionTime":"2025-10-03T08:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.522551 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.522969 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.523026 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.529013 4765 generic.go:334] "Generic (PLEG): container finished" podID="4f105c06-3e67-486f-a622-923ae442117c" containerID="7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847" exitCode=0 Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.529076 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" event={"ID":"4f105c06-3e67-486f-a622-923ae442117c","Type":"ContainerDied","Data":"7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.540012 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.545660 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.547759 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.555819 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.566245 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.584034 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b17cdcd60c8c79a10f7b5c583c138ae27bd93d
cfb6571c37d800acf1360556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.596906 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.613932 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.621377 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.621417 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.621426 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.621446 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.621456 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:43Z","lastTransitionTime":"2025-10-03T08:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.627324 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.639026 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.664947 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.678822 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.692125 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.715992 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.724164 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.724359 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.724447 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.724608 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.724734 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:43Z","lastTransitionTime":"2025-10-03T08:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.732053 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.745130 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.760824 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.774409 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.788408 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.801027 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.820018 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.826565 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.826820 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.826919 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.826990 4765 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.827051 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:43Z","lastTransitionTime":"2025-10-03T08:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.834500 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay
.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.847982 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.858560 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.870833 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.884856 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.896316 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.909221 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.928249 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.929945 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.929974 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.929983 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.929997 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.930007 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:43Z","lastTransitionTime":"2025-10-03T08:39:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.941494 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.956346 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:43 crc kubenswrapper[4765]: I1003 08:39:43.968964 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.032339 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.032381 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.032392 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.032422 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.032437 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:44Z","lastTransitionTime":"2025-10-03T08:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.134818 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.134856 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.134867 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.134881 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.134890 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:44Z","lastTransitionTime":"2025-10-03T08:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.236729 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.236775 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.236785 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.236799 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.236812 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:44Z","lastTransitionTime":"2025-10-03T08:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.339482 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.339514 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.339522 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.339536 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.339546 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:44Z","lastTransitionTime":"2025-10-03T08:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.441507 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.441878 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.441890 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.441906 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.441917 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:44Z","lastTransitionTime":"2025-10-03T08:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.535452 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" event={"ID":"4f105c06-3e67-486f-a622-923ae442117c","Type":"ContainerStarted","Data":"d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19"} Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.536231 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.545361 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.545401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.545410 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.545424 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.545435 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:44Z","lastTransitionTime":"2025-10-03T08:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.550714 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.562351 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.573961 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.589998 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.601088 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.612245 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.623271 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.635823 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.646219 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.647021 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.647074 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.647084 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.647100 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.647110 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:44Z","lastTransitionTime":"2025-10-03T08:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.656409 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.672674 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.683282 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.697084 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc34
7965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.706922 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.719145 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.749919 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.749959 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.749973 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:44 crc 
kubenswrapper[4765]: I1003 08:39:44.749992 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.750005 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:44Z","lastTransitionTime":"2025-10-03T08:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.852821 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.852890 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.852903 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.852925 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.852939 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:44Z","lastTransitionTime":"2025-10-03T08:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.955339 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.955378 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.955387 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.955405 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:44 crc kubenswrapper[4765]: I1003 08:39:44.955416 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:44Z","lastTransitionTime":"2025-10-03T08:39:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.059324 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.059376 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.059389 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.059411 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.059423 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:45Z","lastTransitionTime":"2025-10-03T08:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.162580 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.162620 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.162631 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.162667 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.162680 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:45Z","lastTransitionTime":"2025-10-03T08:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.265143 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.265248 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.265270 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.265301 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.265321 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:45Z","lastTransitionTime":"2025-10-03T08:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.306248 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:45 crc kubenswrapper[4765]: E1003 08:39:45.306419 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.306813 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:45 crc kubenswrapper[4765]: E1003 08:39:45.306888 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.306947 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:45 crc kubenswrapper[4765]: E1003 08:39:45.306990 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.368034 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.368072 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.368084 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.368100 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.368112 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:45Z","lastTransitionTime":"2025-10-03T08:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.470821 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.471063 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.471190 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.471265 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.471323 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:45Z","lastTransitionTime":"2025-10-03T08:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.540534 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/0.log" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.543672 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556" exitCode=1 Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.543755 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556"} Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.544240 4765 scope.go:117] "RemoveContainer" containerID="38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.562515 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.573601 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.573639 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:45 crc 
kubenswrapper[4765]: I1003 08:39:45.573666 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.573683 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.573695 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:45Z","lastTransitionTime":"2025-10-03T08:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.573787 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.584496 4765 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.598385 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.611833 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.623235 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.637302 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.648878 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.661111 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cn
i/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.673971 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.675942 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.675962 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.675972 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.675988 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.675999 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:45Z","lastTransitionTime":"2025-10-03T08:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.701498 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.717613 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.729664 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.747371 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b17cdcd60c8c79a10f7b5c583c138ae27bd93d
cfb6571c37d800acf1360556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:45Z\\\",\\\"message\\\":\\\" 6049 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:39:45.407639 6049 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:39:45.407708 6049 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:39:45.407720 6049 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:39:45.407747 6049 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 08:39:45.407763 6049 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:39:45.407763 6049 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:39:45.407766 6049 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:39:45.407776 6049 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 08:39:45.407781 6049 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 08:39:45.407792 6049 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 08:39:45.407792 6049 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:39:45.407897 6049 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:39:45.407932 6049 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:39:45.407932 6049 factory.go:656] Stopping watch factory\\\\nI1003 08:39:45.407945 6049 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.760355 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.778294 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.778328 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.778340 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.778359 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.778393 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:45Z","lastTransitionTime":"2025-10-03T08:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.880689 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.880733 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.880743 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.880757 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.880767 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:45Z","lastTransitionTime":"2025-10-03T08:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.983677 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.983728 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.983739 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.983760 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:45 crc kubenswrapper[4765]: I1003 08:39:45.983773 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:45Z","lastTransitionTime":"2025-10-03T08:39:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.086547 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.086603 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.086617 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.086656 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.086671 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:46Z","lastTransitionTime":"2025-10-03T08:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.189657 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.189718 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.189732 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.189760 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.189778 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:46Z","lastTransitionTime":"2025-10-03T08:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.292160 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.292199 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.292208 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.292223 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.292234 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:46Z","lastTransitionTime":"2025-10-03T08:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.320020 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.336633 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.348788 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.361259 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.394538 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.394579 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.394588 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.394606 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.394621 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:46Z","lastTransitionTime":"2025-10-03T08:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.424688 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.444848 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.470431 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.497316 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.497832 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.497861 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.497905 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.497924 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.497935 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:46Z","lastTransitionTime":"2025-10-03T08:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.519043 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.537080 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.550405 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/0.log" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.552991 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\
\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.553590 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28"} Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.553758 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.567449 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.580668 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.591178 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.600761 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.601072 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.601209 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.601329 4765 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.601414 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:46Z","lastTransitionTime":"2025-10-03T08:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.609066 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b17cdcd60c8c79a10f7b5c583c138ae27bd93d
cfb6571c37d800acf1360556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:45Z\\\",\\\"message\\\":\\\" 6049 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:39:45.407639 6049 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:39:45.407708 6049 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:39:45.407720 6049 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:39:45.407747 6049 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 08:39:45.407763 6049 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:39:45.407763 6049 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:39:45.407766 6049 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:39:45.407776 6049 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 08:39:45.407781 6049 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 08:39:45.407792 6049 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 08:39:45.407792 6049 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:39:45.407897 6049 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:39:45.407932 6049 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:39:45.407932 6049 factory.go:656] Stopping watch factory\\\\nI1003 08:39:45.407945 6049 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.622914 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.637064 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.650447 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.669511 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:45Z\\\",\\\"message\\\":\\\" 6049 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:39:45.407639 6049 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:39:45.407708 6049 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:39:45.407720 6049 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:39:45.407747 6049 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 08:39:45.407763 6049 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:39:45.407763 6049 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:39:45.407766 6049 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:39:45.407776 6049 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 08:39:45.407781 6049 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 08:39:45.407792 6049 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 08:39:45.407792 6049 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:39:45.407897 6049 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:39:45.407932 6049 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:39:45.407932 6049 factory.go:656] Stopping watch factory\\\\nI1003 08:39:45.407945 6049 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.687854 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.705334 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.705395 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.705289 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.705407 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.705504 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 
08:39:46.705520 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:46Z","lastTransitionTime":"2025-10-03T08:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.719203 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.730820 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.744016 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.756596 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.770241 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.790746 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.801622 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.808407 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.808437 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.808488 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.808503 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.808512 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:46Z","lastTransitionTime":"2025-10-03T08:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.814577 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.826485 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.910352 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.910390 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.910400 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.910416 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:46 crc kubenswrapper[4765]: I1003 08:39:46.910428 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:46Z","lastTransitionTime":"2025-10-03T08:39:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.012952 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.013503 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.013518 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.013538 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.013551 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:47Z","lastTransitionTime":"2025-10-03T08:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.116462 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.116535 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.116547 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.116566 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.116582 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:47Z","lastTransitionTime":"2025-10-03T08:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.219287 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.219333 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.219345 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.219362 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.219375 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:47Z","lastTransitionTime":"2025-10-03T08:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.305924 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.305936 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:47 crc kubenswrapper[4765]: E1003 08:39:47.306073 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.305924 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:47 crc kubenswrapper[4765]: E1003 08:39:47.306171 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:47 crc kubenswrapper[4765]: E1003 08:39:47.306358 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.322041 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.322081 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.322091 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.322109 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.322120 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:47Z","lastTransitionTime":"2025-10-03T08:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.424382 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.424701 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.424776 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.424846 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.424910 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:47Z","lastTransitionTime":"2025-10-03T08:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.527784 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.528062 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.528141 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.528216 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.528277 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:47Z","lastTransitionTime":"2025-10-03T08:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.558554 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/1.log" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.559626 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/0.log" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.562346 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28" exitCode=1 Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.562393 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28"} Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.562450 4765 scope.go:117] "RemoveContainer" containerID="38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.562970 4765 scope.go:117] "RemoveContainer" containerID="4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28" Oct 03 08:39:47 crc kubenswrapper[4765]: E1003 08:39:47.563129 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\"" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.575674 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.587760 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.607011 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.618679 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.629060 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.630304 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.630342 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.630353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.630373 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.630386 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:47Z","lastTransitionTime":"2025-10-03T08:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.648582 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd
/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\
"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.661032 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.672795 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.690892 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98
379e921f58d444c960308e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:45Z\\\",\\\"message\\\":\\\" 6049 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:39:45.407639 6049 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:39:45.407708 6049 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:39:45.407720 6049 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:39:45.407747 6049 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 08:39:45.407763 6049 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:39:45.407763 6049 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:39:45.407766 6049 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:39:45.407776 6049 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 08:39:45.407781 6049 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 08:39:45.407792 6049 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 08:39:45.407792 6049 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:39:45.407897 6049 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:39:45.407932 6049 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:39:45.407932 6049 factory.go:656] Stopping watch factory\\\\nI1003 08:39:45.407945 6049 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:46Z\\\",\\\"message\\\":\\\"erator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761968 6208 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1003 08:39:46.761841 6208 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1003 08:39:46.761976 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1003 08:39:46.761984 6208 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761892 6208 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 08:39:46.762002 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1003 08:39:46.762008 6208 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 08:39:46.761983 6208 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 
08:39:46.762028 6208 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad
034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.705478 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.717621 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.728212 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.732089 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 
08:39:47.732132 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.732145 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.732164 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.732177 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:47Z","lastTransitionTime":"2025-10-03T08:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.739193 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.751884 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.766034 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.834217 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.834260 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:47 crc 
kubenswrapper[4765]: I1003 08:39:47.834271 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.834289 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.834302 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:47Z","lastTransitionTime":"2025-10-03T08:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.937682 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.937719 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.937749 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.937764 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:47 crc kubenswrapper[4765]: I1003 08:39:47.937775 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:47Z","lastTransitionTime":"2025-10-03T08:39:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.041241 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.041328 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.041346 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.041370 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.041385 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:48Z","lastTransitionTime":"2025-10-03T08:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.143886 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.143945 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.143963 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.143983 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.143995 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:48Z","lastTransitionTime":"2025-10-03T08:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.246823 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.246869 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.246880 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.246896 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.246906 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:48Z","lastTransitionTime":"2025-10-03T08:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.349383 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.349433 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.349444 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.349464 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.349477 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:48Z","lastTransitionTime":"2025-10-03T08:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.452887 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.453311 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.453429 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.453502 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.453561 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:48Z","lastTransitionTime":"2025-10-03T08:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.516558 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.521814 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.537216 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.553126 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.555998 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.556127 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.556228 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.556298 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.556363 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:48Z","lastTransitionTime":"2025-10-03T08:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.566728 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/1.log" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.571683 4765 scope.go:117] "RemoveContainer" containerID="4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28" Oct 03 08:39:48 crc kubenswrapper[4765]: E1003 08:39:48.571839 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\"" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.575587 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.584243 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq"] Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.584677 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.587442 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.591383 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.597348 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.615454 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.627404 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.640502 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.656144 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.659041 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.659122 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.659138 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.659162 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.659178 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:48Z","lastTransitionTime":"2025-10-03T08:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.682602 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.706355 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cr
i-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.711022 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.711069 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.711308 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggxtg\" (UniqueName: \"kubernetes.io/projected/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-kube-api-access-ggxtg\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.711408 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.722492 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.740024 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.761505 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:48 
crc kubenswrapper[4765]: I1003 08:39:48.761567 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.761579 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.761597 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.761611 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:48Z","lastTransitionTime":"2025-10-03T08:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.768471 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98
379e921f58d444c960308e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b17cdcd60c8c79a10f7b5c583c138ae27bd93dcfb6571c37d800acf1360556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:45Z\\\",\\\"message\\\":\\\" 6049 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:39:45.407639 6049 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:39:45.407708 6049 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:39:45.407720 6049 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:39:45.407747 6049 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 08:39:45.407763 6049 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:39:45.407763 6049 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:39:45.407766 6049 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:39:45.407776 6049 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 08:39:45.407781 6049 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 08:39:45.407792 6049 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 08:39:45.407792 6049 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:39:45.407897 6049 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:39:45.407932 6049 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:39:45.407932 6049 factory.go:656] Stopping watch factory\\\\nI1003 08:39:45.407945 6049 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:46Z\\\",\\\"message\\\":\\\"erator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761968 6208 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1003 08:39:46.761841 6208 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1003 08:39:46.761976 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1003 08:39:46.761984 6208 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761892 6208 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 08:39:46.762002 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1003 08:39:46.762008 6208 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 08:39:46.761983 6208 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 
08:39:46.762028 6208 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad
034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.782522 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.802972 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.812741 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.812962 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.813070 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggxtg\" (UniqueName: \"kubernetes.io/projected/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-kube-api-access-ggxtg\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.813159 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.813874 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.814209 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.818626 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.820889 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.830399 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggxtg\" (UniqueName: \"kubernetes.io/projected/fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f-kube-api-access-ggxtg\") pod \"ovnkube-control-plane-749d76644c-9pssq\" (UID: \"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.832456 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.849503 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.864709 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.864991 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.865105 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.865210 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.865285 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:48Z","lastTransitionTime":"2025-10-03T08:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.869184 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.888831 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.899016 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.907796 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\
\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: W1003 08:39:48.915157 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcbd8c60_e4bc_43c1_b769_9ae58a05ea0f.slice/crio-da4e052d948810f7ad501828dad6ce7b40faf46494bcbc7f76c03e5d9e3d4b9a WatchSource:0}: Error finding container da4e052d948810f7ad501828dad6ce7b40faf46494bcbc7f76c03e5d9e3d4b9a: Status 404 returned error can't find the container with id da4e052d948810f7ad501828dad6ce7b40faf46494bcbc7f76c03e5d9e3d4b9a Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.923692 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.938121 4765 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.952223 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.966000 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.969362 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.969396 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.969411 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.969425 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.969437 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:48Z","lastTransitionTime":"2025-10-03T08:39:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:48 crc kubenswrapper[4765]: I1003 08:39:48.989130 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:46Z\\\",\\\"message\\\":\\\"erator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761968 6208 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1003 08:39:46.761841 6208 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1003 08:39:46.761976 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1003 08:39:46.761984 6208 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761892 6208 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 08:39:46.762002 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1003 08:39:46.762008 6208 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 08:39:46.761983 6208 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 08:39:46.762028 6208 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.006279 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.022427 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.033904 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.043607 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.055755 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.077305 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.077348 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.077357 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.077378 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.077393 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:49Z","lastTransitionTime":"2025-10-03T08:39:49Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.180564 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.180624 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.180634 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.180665 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.180678 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:49Z","lastTransitionTime":"2025-10-03T08:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.283280 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.283322 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.283339 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.283363 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.283376 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:49Z","lastTransitionTime":"2025-10-03T08:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.305845 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.305877 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.305848 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:49 crc kubenswrapper[4765]: E1003 08:39:49.305985 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:49 crc kubenswrapper[4765]: E1003 08:39:49.306305 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:49 crc kubenswrapper[4765]: E1003 08:39:49.306380 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.385989 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.386054 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.386067 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.386089 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.386103 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:49Z","lastTransitionTime":"2025-10-03T08:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.489054 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.489087 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.489096 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.489112 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.489123 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:49Z","lastTransitionTime":"2025-10-03T08:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.576749 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" event={"ID":"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f","Type":"ContainerStarted","Data":"d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173"} Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.576828 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" event={"ID":"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f","Type":"ContainerStarted","Data":"fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d"} Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.576847 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" event={"ID":"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f","Type":"ContainerStarted","Data":"da4e052d948810f7ad501828dad6ce7b40faf46494bcbc7f76c03e5d9e3d4b9a"} Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.577519 4765 scope.go:117] "RemoveContainer" containerID="4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28" Oct 03 08:39:49 crc kubenswrapper[4765]: E1003 08:39:49.577666 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\"" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.591309 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.591359 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.591370 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.591387 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.591399 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:49Z","lastTransitionTime":"2025-10-03T08:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.601232 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.618928 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.634575 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.648423 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.663128 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.680527 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.694356 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.694406 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.694417 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.694440 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.694454 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:49Z","lastTransitionTime":"2025-10-03T08:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.695325 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.715984 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:46Z\\\",\\\"message\\\":\\\"erator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761968 6208 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1003 08:39:46.761841 6208 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1003 08:39:46.761976 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1003 08:39:46.761984 6208 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761892 6208 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 08:39:46.762002 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1003 08:39:46.762008 6208 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 08:39:46.761983 6208 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 08:39:46.762028 6208 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.732780 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.750861 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.762524 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.774623 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.788788 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 
08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.796863 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.796901 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.796911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.796928 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.796939 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:49Z","lastTransitionTime":"2025-10-03T08:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.807104 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 
08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.828062 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.843928 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.899422 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.899475 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.899491 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.899515 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:49 crc kubenswrapper[4765]: I1003 08:39:49.899531 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:49Z","lastTransitionTime":"2025-10-03T08:39:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.006206 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.006254 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.006264 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.006280 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.006290 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:50Z","lastTransitionTime":"2025-10-03T08:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.109745 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.109789 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.109801 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.109819 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.109830 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:50Z","lastTransitionTime":"2025-10-03T08:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.213276 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.213334 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.213349 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.213369 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.213380 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:50Z","lastTransitionTime":"2025-10-03T08:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.315530 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.315614 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.315674 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.315708 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.315733 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:50Z","lastTransitionTime":"2025-10-03T08:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.419169 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.419238 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.419256 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.419287 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.419307 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:50Z","lastTransitionTime":"2025-10-03T08:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.429616 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wdwf5"] Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.430893 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:50 crc kubenswrapper[4765]: E1003 08:39:50.431040 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.455712 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.473329 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.496816 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.512327 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.522179 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.522236 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.522252 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.522275 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.522290 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:50Z","lastTransitionTime":"2025-10-03T08:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.532547 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t858\" (UniqueName: \"kubernetes.io/projected/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-kube-api-access-9t858\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.532943 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.532597 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.549759 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.565425 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.597479 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.612983 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.628755 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 
03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.631009 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.631053 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.631065 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.631084 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.631096 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:50Z","lastTransitionTime":"2025-10-03T08:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.633475 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t858\" (UniqueName: \"kubernetes.io/projected/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-kube-api-access-9t858\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.633530 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:50 crc kubenswrapper[4765]: E1003 08:39:50.633700 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:39:50 crc kubenswrapper[4765]: E1003 08:39:50.633750 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs podName:6824483c-e9a7-4e95-bb3d-e00bac2af3aa nodeName:}" failed. No retries permitted until 2025-10-03 08:39:51.133736201 +0000 UTC m=+35.435230531 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs") pod "network-metrics-daemon-wdwf5" (UID: "6824483c-e9a7-4e95-bb3d-e00bac2af3aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.653732 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t858\" (UniqueName: \"kubernetes.io/projected/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-kube-api-access-9t858\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.654898 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98
379e921f58d444c960308e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:46Z\\\",\\\"message\\\":\\\"erator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761968 6208 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1003 08:39:46.761841 6208 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1003 08:39:46.761976 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1003 08:39:46.761984 6208 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761892 6208 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 08:39:46.762002 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1003 08:39:46.762008 6208 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 08:39:46.761983 6208 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 08:39:46.762028 6208 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.670801 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.690472 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.707285 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.721549 4765 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.733630 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.733695 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.733709 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.733731 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.733747 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:50Z","lastTransitionTime":"2025-10-03T08:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.736551 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.749855 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799
488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.836968 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.837028 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.837037 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.837056 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.837068 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:50Z","lastTransitionTime":"2025-10-03T08:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.940279 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.940345 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.940365 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.940392 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:50 crc kubenswrapper[4765]: I1003 08:39:50.940413 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:50Z","lastTransitionTime":"2025-10-03T08:39:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.037359 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.037845 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:40:07.037780206 +0000 UTC m=+51.339274586 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.043738 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.043798 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.043817 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.043849 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.043870 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.138683 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.138739 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.138760 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.138784 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.138812 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.138962 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.138981 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.138993 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.138987 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.139229 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.139052 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:07.139033655 +0000 UTC m=+51.440527985 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.139300 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:07.139271651 +0000 UTC m=+51.440766051 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.139319 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs podName:6824483c-e9a7-4e95-bb3d-e00bac2af3aa nodeName:}" failed. No retries permitted until 2025-10-03 08:39:52.139309852 +0000 UTC m=+36.440804282 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs") pod "network-metrics-daemon-wdwf5" (UID: "6824483c-e9a7-4e95-bb3d-e00bac2af3aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.139370 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.139408 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:07.139397264 +0000 UTC m=+51.440891704 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.139498 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.139514 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.139529 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.139575 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:07.139565868 +0000 UTC m=+51.441060198 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.146698 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.146742 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.146751 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.146766 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.146778 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.249842 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.250331 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.250342 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.250362 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.250375 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.307489 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.307906 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.307957 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.308136 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.308266 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.308352 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.353449 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.353498 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.353517 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.353538 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.353550 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.456952 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.457046 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.457067 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.457096 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.457117 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.559326 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.559393 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.559403 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.559417 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.559427 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.662581 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.662633 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.662660 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.662679 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.662690 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.766032 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.766074 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.766086 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.766101 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.766117 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.869069 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.869113 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.869122 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.869137 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.869150 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.891716 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.891975 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.891988 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.892014 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.892028 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.906403 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.911955 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.912015 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.912027 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.912047 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.912061 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.931638 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.935414 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.935458 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.935467 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.935485 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.935495 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.946669 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.950619 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.950685 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
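Every status-patch retry in this burst fails for the same reason given at the end of each err string: the serving certificate of the node.network-node-identity webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, well before the node's clock time of 2025-10-03T08:39:51Z. A minimal Go sketch along the following lines can confirm that validity window from the node itself; this is an assumed diagnostic helper, not part of the kubelet or of this log, and the endpoint address is simply copied from the error above.

// inspect_webhook_cert.go -- hypothetical diagnostic sketch: dial the webhook
// endpoint the kubelet is failing against and print the serving certificate's
// validity window, to confirm the "certificate has expired" error above.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// InsecureSkipVerify is deliberate: the goal is to read the expired
	// certificate, not to validate it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		log.Fatal("no certificate presented")
	}
	cert := certs[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("certificate is expired, matching the kubelet error above")
	}
}

Until that certificate is rotated, the webhook keeps rejecting the kubelet's node-status patches, which is why the remaining retries below also fail and the update is eventually abandoned with "update node status exceeds retry count".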
event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.950698 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.950717 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.950730 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.961711 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.965346 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.965374 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
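Independent of the webhook failure, every NotReady condition in this window carries the same message: there is no CNI configuration file in /etc/kubernetes/cni/net.d/, so the container runtime network is not ready. A short sketch like the one below, again an assumed diagnostic rather than anything taken from this log, lists that directory to show whether the network plugin has published its configuration yet.

// check_cni_config.go -- hypothetical diagnostic sketch: list the CNI
// configuration directory the kubelet reports as empty; any .conf/.conflist
// file appearing here means the network plugin has written its configuration.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	const cniDir = "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(cniDir)
	if err != nil {
		log.Fatalf("read %s: %v", cniDir, err)
	}
	if len(entries) == 0 {
		fmt.Println("no CNI configuration files found; NetworkReady stays false")
		return
	}
	for _, e := range entries {
		fmt.Println(filepath.Join(cniDir, e.Name()))
	}
}

An empty directory here lines up with the later entries for openshift-multus/network-metrics-daemon-wdwf5: the pod's sandbox cannot be started and its metrics-certs volume mount keeps being retried while the network is not ready.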
event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.965383 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.965423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.965439 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.976412 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:51 crc kubenswrapper[4765]: E1003 08:39:51.976519 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.978057 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.978114 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.978128 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.978149 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:51 crc kubenswrapper[4765]: I1003 08:39:51.978163 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:51Z","lastTransitionTime":"2025-10-03T08:39:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.080741 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.080789 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.080799 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.080819 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.080831 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:52Z","lastTransitionTime":"2025-10-03T08:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.151271 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:52 crc kubenswrapper[4765]: E1003 08:39:52.151557 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:39:52 crc kubenswrapper[4765]: E1003 08:39:52.151706 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs podName:6824483c-e9a7-4e95-bb3d-e00bac2af3aa nodeName:}" failed. No retries permitted until 2025-10-03 08:39:54.151680895 +0000 UTC m=+38.453175415 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs") pod "network-metrics-daemon-wdwf5" (UID: "6824483c-e9a7-4e95-bb3d-e00bac2af3aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.184009 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.184084 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.184100 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.184123 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.184139 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:52Z","lastTransitionTime":"2025-10-03T08:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.286716 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.286775 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.286788 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.286810 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.286823 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:52Z","lastTransitionTime":"2025-10-03T08:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.306107 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:52 crc kubenswrapper[4765]: E1003 08:39:52.306330 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.388502 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.388534 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.388544 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.388558 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.388568 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:52Z","lastTransitionTime":"2025-10-03T08:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.491543 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.491586 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.491597 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.491614 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.491628 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:52Z","lastTransitionTime":"2025-10-03T08:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.594487 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.594530 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.594538 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.594553 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.594563 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:52Z","lastTransitionTime":"2025-10-03T08:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.698503 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.698549 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.698560 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.698578 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.698590 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:52Z","lastTransitionTime":"2025-10-03T08:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.801553 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.801612 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.801625 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.801672 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.801763 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:52Z","lastTransitionTime":"2025-10-03T08:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.904683 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.904749 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.904763 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.904787 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:52 crc kubenswrapper[4765]: I1003 08:39:52.904800 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:52Z","lastTransitionTime":"2025-10-03T08:39:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.007740 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.007822 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.007846 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.007882 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.007907 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:53Z","lastTransitionTime":"2025-10-03T08:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.111003 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.111075 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.111090 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.111115 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.111135 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:53Z","lastTransitionTime":"2025-10-03T08:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.214498 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.214555 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.214566 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.214584 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.214596 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:53Z","lastTransitionTime":"2025-10-03T08:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.306633 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.306690 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.306693 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:53 crc kubenswrapper[4765]: E1003 08:39:53.307058 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:53 crc kubenswrapper[4765]: E1003 08:39:53.307129 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:53 crc kubenswrapper[4765]: E1003 08:39:53.307274 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.316925 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.317007 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.317019 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.317041 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.317055 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:53Z","lastTransitionTime":"2025-10-03T08:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.419473 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.419528 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.419540 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.419556 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.419565 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:53Z","lastTransitionTime":"2025-10-03T08:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.522389 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.522422 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.522433 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.522450 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.522461 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:53Z","lastTransitionTime":"2025-10-03T08:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.625515 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.625592 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.625608 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.625629 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.625672 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:53Z","lastTransitionTime":"2025-10-03T08:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.728768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.728826 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.728839 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.728860 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.728874 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:53Z","lastTransitionTime":"2025-10-03T08:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.833002 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.833057 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.833067 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.833086 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.833097 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:53Z","lastTransitionTime":"2025-10-03T08:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.936509 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.936571 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.936583 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.936608 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:53 crc kubenswrapper[4765]: I1003 08:39:53.936622 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:53Z","lastTransitionTime":"2025-10-03T08:39:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.039935 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.039982 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.039993 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.040010 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.040024 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:54Z","lastTransitionTime":"2025-10-03T08:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.142876 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.142932 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.142946 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.142965 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.142980 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:54Z","lastTransitionTime":"2025-10-03T08:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.173801 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:54 crc kubenswrapper[4765]: E1003 08:39:54.173965 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:39:54 crc kubenswrapper[4765]: E1003 08:39:54.174133 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs podName:6824483c-e9a7-4e95-bb3d-e00bac2af3aa nodeName:}" failed. No retries permitted until 2025-10-03 08:39:58.17411049 +0000 UTC m=+42.475604820 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs") pod "network-metrics-daemon-wdwf5" (UID: "6824483c-e9a7-4e95-bb3d-e00bac2af3aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.247015 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.247074 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.247087 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.247106 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.247119 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:54Z","lastTransitionTime":"2025-10-03T08:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.305883 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:54 crc kubenswrapper[4765]: E1003 08:39:54.306136 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.350704 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.350760 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.350774 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.350796 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.350810 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:54Z","lastTransitionTime":"2025-10-03T08:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.453297 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.453350 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.453365 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.453383 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.453395 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:54Z","lastTransitionTime":"2025-10-03T08:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.556386 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.556428 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.556437 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.556453 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.556463 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:54Z","lastTransitionTime":"2025-10-03T08:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.659981 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.660042 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.660059 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.660077 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.660091 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:54Z","lastTransitionTime":"2025-10-03T08:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.763824 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.763889 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.763903 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.763924 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.763937 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:54Z","lastTransitionTime":"2025-10-03T08:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.867357 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.867410 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.867419 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.867438 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.867447 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:54Z","lastTransitionTime":"2025-10-03T08:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.970593 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.970665 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.970688 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.970707 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:54 crc kubenswrapper[4765]: I1003 08:39:54.970719 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:54Z","lastTransitionTime":"2025-10-03T08:39:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.073841 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.073925 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.073939 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.073964 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.073979 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:55Z","lastTransitionTime":"2025-10-03T08:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.177355 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.177408 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.177421 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.177445 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.177460 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:55Z","lastTransitionTime":"2025-10-03T08:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.280529 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.280592 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.280606 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.280628 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.280668 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:55Z","lastTransitionTime":"2025-10-03T08:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.306130 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.306312 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.306340 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:55 crc kubenswrapper[4765]: E1003 08:39:55.306624 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:55 crc kubenswrapper[4765]: E1003 08:39:55.306815 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:55 crc kubenswrapper[4765]: E1003 08:39:55.306866 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.384008 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.384071 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.384085 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.384108 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.384124 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:55Z","lastTransitionTime":"2025-10-03T08:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.487578 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.487678 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.487688 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.487707 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.487717 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:55Z","lastTransitionTime":"2025-10-03T08:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.590826 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.590884 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.590906 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.590929 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.590942 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:55Z","lastTransitionTime":"2025-10-03T08:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.693788 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.693835 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.693846 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.693864 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.693873 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:55Z","lastTransitionTime":"2025-10-03T08:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.797144 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.797204 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.797213 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.797230 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.797241 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:55Z","lastTransitionTime":"2025-10-03T08:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.900182 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.900224 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.900232 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.900248 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:55 crc kubenswrapper[4765]: I1003 08:39:55.900261 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:55Z","lastTransitionTime":"2025-10-03T08:39:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.002824 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.002897 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.002906 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.002922 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.002932 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:56Z","lastTransitionTime":"2025-10-03T08:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.105787 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.105867 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.105881 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.105902 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.105916 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:56Z","lastTransitionTime":"2025-10-03T08:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.209352 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.209403 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.209413 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.209436 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.209451 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:56Z","lastTransitionTime":"2025-10-03T08:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.305922 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:56 crc kubenswrapper[4765]: E1003 08:39:56.306102 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.311066 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.311112 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.311120 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.311134 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.311144 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:56Z","lastTransitionTime":"2025-10-03T08:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.323360 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.338231 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.352269 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.365910 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.378685 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.394451 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.408834 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.412956 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.412998 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.413008 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.413024 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.413035 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:56Z","lastTransitionTime":"2025-10-03T08:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.438677 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.457104 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.471811 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.493420 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98
379e921f58d444c960308e28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:46Z\\\",\\\"message\\\":\\\"erator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761968 6208 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1003 08:39:46.761841 6208 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1003 08:39:46.761976 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1003 08:39:46.761984 6208 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761892 6208 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 08:39:46.762002 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1003 08:39:46.762008 6208 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 08:39:46.761983 6208 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 08:39:46.762028 6208 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.509338 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.515867 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.516012 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.516125 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.516216 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.516298 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:56Z","lastTransitionTime":"2025-10-03T08:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.527253 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.540105 4765 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.552467 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.565322 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 
08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.578202 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:39:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.619740 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.619812 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.619825 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.619843 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.619854 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:56Z","lastTransitionTime":"2025-10-03T08:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.722999 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.723045 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.723055 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.723085 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.723098 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:56Z","lastTransitionTime":"2025-10-03T08:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.825018 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.825088 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.825102 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.825123 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.825139 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:56Z","lastTransitionTime":"2025-10-03T08:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.928264 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.928326 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.928342 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.928368 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:56 crc kubenswrapper[4765]: I1003 08:39:56.928390 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:56Z","lastTransitionTime":"2025-10-03T08:39:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.031245 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.031600 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.031623 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.031661 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.031673 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:57Z","lastTransitionTime":"2025-10-03T08:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.134735 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.134779 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.134789 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.134806 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.134817 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:57Z","lastTransitionTime":"2025-10-03T08:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.237471 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.237512 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.237521 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.237535 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.237544 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:57Z","lastTransitionTime":"2025-10-03T08:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.306497 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:57 crc kubenswrapper[4765]: E1003 08:39:57.306675 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.306688 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.306792 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:57 crc kubenswrapper[4765]: E1003 08:39:57.307065 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:57 crc kubenswrapper[4765]: E1003 08:39:57.307148 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.340418 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.340500 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.340514 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.340536 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.340968 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:57Z","lastTransitionTime":"2025-10-03T08:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.443818 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.443865 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.443874 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.443895 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.443909 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:57Z","lastTransitionTime":"2025-10-03T08:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.546846 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.546907 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.546919 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.546940 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.546955 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:57Z","lastTransitionTime":"2025-10-03T08:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.650345 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.650389 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.650401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.650422 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.650437 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:57Z","lastTransitionTime":"2025-10-03T08:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.754010 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.754067 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.754083 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.754104 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.754120 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:57Z","lastTransitionTime":"2025-10-03T08:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.857344 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.857414 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.857423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.857441 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.857452 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:57Z","lastTransitionTime":"2025-10-03T08:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.960360 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.960420 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.960433 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.960454 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:57 crc kubenswrapper[4765]: I1003 08:39:57.960469 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:57Z","lastTransitionTime":"2025-10-03T08:39:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.063217 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.063278 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.063292 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.063311 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.063322 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:58Z","lastTransitionTime":"2025-10-03T08:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.165813 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.165864 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.165878 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.165895 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.165906 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:58Z","lastTransitionTime":"2025-10-03T08:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.215622 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:58 crc kubenswrapper[4765]: E1003 08:39:58.215782 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:39:58 crc kubenswrapper[4765]: E1003 08:39:58.215841 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs podName:6824483c-e9a7-4e95-bb3d-e00bac2af3aa nodeName:}" failed. No retries permitted until 2025-10-03 08:40:06.215827939 +0000 UTC m=+50.517322269 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs") pod "network-metrics-daemon-wdwf5" (UID: "6824483c-e9a7-4e95-bb3d-e00bac2af3aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.267885 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.267932 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.267944 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.267962 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.267975 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:58Z","lastTransitionTime":"2025-10-03T08:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.306996 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:39:58 crc kubenswrapper[4765]: E1003 08:39:58.307182 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.371029 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.371065 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.371098 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.371117 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.371127 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:58Z","lastTransitionTime":"2025-10-03T08:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.474157 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.474214 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.474231 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.474252 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.474268 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:58Z","lastTransitionTime":"2025-10-03T08:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.576996 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.577053 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.577066 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.577085 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.577101 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:58Z","lastTransitionTime":"2025-10-03T08:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.680681 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.680742 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.680756 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.680775 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.680789 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:58Z","lastTransitionTime":"2025-10-03T08:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.784099 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.784154 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.784166 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.784186 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.784199 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:58Z","lastTransitionTime":"2025-10-03T08:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.886817 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.886869 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.886880 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.886899 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.886910 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:58Z","lastTransitionTime":"2025-10-03T08:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.989462 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.989518 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.989528 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.989543 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:58 crc kubenswrapper[4765]: I1003 08:39:58.989554 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:58Z","lastTransitionTime":"2025-10-03T08:39:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.092690 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.092772 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.092796 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.092828 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.092860 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:59Z","lastTransitionTime":"2025-10-03T08:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.195853 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.195928 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.195948 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.195976 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.195997 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:59Z","lastTransitionTime":"2025-10-03T08:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.297900 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.297949 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.297960 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.297980 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.297992 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:59Z","lastTransitionTime":"2025-10-03T08:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.306197 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.306214 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.306242 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:39:59 crc kubenswrapper[4765]: E1003 08:39:59.306686 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:39:59 crc kubenswrapper[4765]: E1003 08:39:59.306764 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:39:59 crc kubenswrapper[4765]: E1003 08:39:59.306828 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.400815 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.400864 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.400873 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.400894 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.400906 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:59Z","lastTransitionTime":"2025-10-03T08:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.503976 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.504044 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.504069 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.504101 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.504126 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:59Z","lastTransitionTime":"2025-10-03T08:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.607610 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.607683 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.607696 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.607712 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.607723 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:59Z","lastTransitionTime":"2025-10-03T08:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.711729 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.711821 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.711846 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.711876 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.711899 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:59Z","lastTransitionTime":"2025-10-03T08:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.815544 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.815597 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.815608 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.815626 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.815637 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:59Z","lastTransitionTime":"2025-10-03T08:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.918020 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.918101 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.918117 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.918144 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:39:59 crc kubenswrapper[4765]: I1003 08:39:59.918176 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:39:59Z","lastTransitionTime":"2025-10-03T08:39:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.020721 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.020789 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.020799 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.020815 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.020826 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:00Z","lastTransitionTime":"2025-10-03T08:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.124077 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.124139 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.124157 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.124179 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.124196 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:00Z","lastTransitionTime":"2025-10-03T08:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.228759 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.228825 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.228839 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.228859 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.228870 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:00Z","lastTransitionTime":"2025-10-03T08:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.306547 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:00 crc kubenswrapper[4765]: E1003 08:40:00.306725 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.331506 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.331552 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.331564 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.331584 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.331600 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:00Z","lastTransitionTime":"2025-10-03T08:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.434396 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.434454 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.434464 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.434484 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.434498 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:00Z","lastTransitionTime":"2025-10-03T08:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.537223 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.537306 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.537321 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.537339 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.537351 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:00Z","lastTransitionTime":"2025-10-03T08:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.639885 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.639956 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.639971 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.639996 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.640020 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:00Z","lastTransitionTime":"2025-10-03T08:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.742946 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.743033 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.743057 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.743091 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.743110 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:00Z","lastTransitionTime":"2025-10-03T08:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.845712 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.845807 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.845842 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.845872 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.845889 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:00Z","lastTransitionTime":"2025-10-03T08:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.949262 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.949339 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.949349 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.949367 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:00 crc kubenswrapper[4765]: I1003 08:40:00.949379 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:00Z","lastTransitionTime":"2025-10-03T08:40:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.051694 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.051739 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.051769 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.051793 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.051808 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:01Z","lastTransitionTime":"2025-10-03T08:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.154629 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.154707 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.154717 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.154732 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.154743 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:01Z","lastTransitionTime":"2025-10-03T08:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.257335 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.257394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.257406 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.257422 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.257432 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:01Z","lastTransitionTime":"2025-10-03T08:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.306494 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.306625 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.306535 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:01 crc kubenswrapper[4765]: E1003 08:40:01.306881 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:01 crc kubenswrapper[4765]: E1003 08:40:01.307105 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:01 crc kubenswrapper[4765]: E1003 08:40:01.307353 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.360830 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.360877 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.360886 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.360906 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.360916 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:01Z","lastTransitionTime":"2025-10-03T08:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.465527 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.465604 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.465733 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.465761 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.465788 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:01Z","lastTransitionTime":"2025-10-03T08:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.568665 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.568714 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.568723 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.568740 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.568751 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:01Z","lastTransitionTime":"2025-10-03T08:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.670996 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.671045 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.671059 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.671079 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.671092 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:01Z","lastTransitionTime":"2025-10-03T08:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.774736 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.774791 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.774805 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.774824 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.774841 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:01Z","lastTransitionTime":"2025-10-03T08:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.878781 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.878851 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.878870 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.878896 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.878917 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:01Z","lastTransitionTime":"2025-10-03T08:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.981223 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.981264 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.981277 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.981320 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:01 crc kubenswrapper[4765]: I1003 08:40:01.981333 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:01Z","lastTransitionTime":"2025-10-03T08:40:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.083379 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.083428 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.083442 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.083462 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.083475 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.185622 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.185679 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.185693 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.185712 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.185723 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.288818 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.288863 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.288877 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.288898 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.288910 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.306402 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:02 crc kubenswrapper[4765]: E1003 08:40:02.306576 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.307311 4765 scope.go:117] "RemoveContainer" containerID="4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.340293 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.340513 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.340712 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.340996 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.341212 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: E1003 08:40:02.353860 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.358273 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.358313 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.358322 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.358339 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.358349 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: E1003 08:40:02.374053 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.379560 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.379614 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.379624 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.379658 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.379674 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: E1003 08:40:02.391544 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.395275 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.395318 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.395333 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.395353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.395365 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: E1003 08:40:02.408376 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.412000 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.412034 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.412042 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.412058 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.412067 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: E1003 08:40:02.425615 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: E1003 08:40:02.425757 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.427693 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.427723 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.427735 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.427752 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.427766 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.529930 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.529966 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.529975 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.529989 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.529999 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.622970 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/1.log" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.626601 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd"} Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.627061 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.642323 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.642369 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.642382 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.642401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.642413 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.662345 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.682555 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.699475 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.723680 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.743942 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.746007 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.746032 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.746042 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.746055 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.746066 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.759777 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.771339 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.789590 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237
df78c085f40bb803f7349ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:46Z\\\",\\\"message\\\":\\\"erator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761968 6208 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1003 08:39:46.761841 6208 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1003 08:39:46.761976 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1003 08:39:46.761984 6208 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761892 6208 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 08:39:46.762002 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1003 08:39:46.762008 6208 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 08:39:46.761983 6208 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 08:39:46.762028 6208 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.802299 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.816608 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc34
7965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.827441 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.838122 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.849049 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.849110 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.849123 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc 
kubenswrapper[4765]: I1003 08:40:02.849139 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.849151 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.850266 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.863068 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.880936 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.901651 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.922217 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.952564 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.952616 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.952629 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.952666 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:02 crc kubenswrapper[4765]: I1003 08:40:02.952678 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:02Z","lastTransitionTime":"2025-10-03T08:40:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.055251 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.055298 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.055309 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.055358 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.055370 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:03Z","lastTransitionTime":"2025-10-03T08:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.157998 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.158032 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.158043 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.158063 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.158074 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:03Z","lastTransitionTime":"2025-10-03T08:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.260900 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.260945 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.260987 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.261005 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.261018 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:03Z","lastTransitionTime":"2025-10-03T08:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.305848 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.305907 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.305932 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:03 crc kubenswrapper[4765]: E1003 08:40:03.306015 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:03 crc kubenswrapper[4765]: E1003 08:40:03.306074 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:03 crc kubenswrapper[4765]: E1003 08:40:03.306188 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.363118 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.363178 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.363195 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.363215 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.363227 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:03Z","lastTransitionTime":"2025-10-03T08:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.465837 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.465888 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.465905 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.465924 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.465935 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:03Z","lastTransitionTime":"2025-10-03T08:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.568744 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.568810 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.568824 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.568843 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.568855 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:03Z","lastTransitionTime":"2025-10-03T08:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.630427 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/2.log" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.630937 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/1.log" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.633510 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd" exitCode=1 Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.633553 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd"} Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.633608 4765 scope.go:117] "RemoveContainer" containerID="4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.634377 4765 scope.go:117] "RemoveContainer" containerID="a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd" Oct 03 08:40:03 crc kubenswrapper[4765]: E1003 08:40:03.634563 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\"" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.647563 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.659915 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.672178 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.672210 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.672219 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.672233 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.672243 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:03Z","lastTransitionTime":"2025-10-03T08:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.673631 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.691114 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.702805 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.714257 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.730536 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237
df78c085f40bb803f7349ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fa0909ee1317cdeb75c73911371e3344b889b98379e921f58d444c960308e28\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:39:46Z\\\",\\\"message\\\":\\\"erator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761968 6208 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1003 08:39:46.761841 6208 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1003 08:39:46.761976 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1003 08:39:46.761984 6208 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1003 08:39:46.761892 6208 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1003 08:39:46.762002 6208 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1003 08:39:46.762008 6208 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1003 08:39:46.761983 6208 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF1003 08:39:46.762028 6208 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:03Z\\\",\\\"message\\\":\\\":false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133396 6432 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133455 6432 
base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1003 08:40:03.133502 6432 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\
\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.743059 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.756524 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.765039 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.774292 4765 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.774363 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.774380 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.774389 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.774402 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.774411 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:03Z","lastTransitionTime":"2025-10-03T08:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.784576 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.795679 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799
488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.808945 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.820836 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.832984 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.848959 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.876747 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.876804 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.876816 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.876833 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.876845 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:03Z","lastTransitionTime":"2025-10-03T08:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.979023 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.979066 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.979079 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.979098 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:03 crc kubenswrapper[4765]: I1003 08:40:03.979109 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:03Z","lastTransitionTime":"2025-10-03T08:40:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.081352 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.081394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.081404 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.081418 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.081428 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:04Z","lastTransitionTime":"2025-10-03T08:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.183615 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.183761 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.183777 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.183795 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.183806 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:04Z","lastTransitionTime":"2025-10-03T08:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.286173 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.286210 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.286223 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.286240 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.286253 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:04Z","lastTransitionTime":"2025-10-03T08:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.306761 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:04 crc kubenswrapper[4765]: E1003 08:40:04.306885 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.388244 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.388273 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.388283 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.388298 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.388307 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:04Z","lastTransitionTime":"2025-10-03T08:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.490283 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.490311 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.490320 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.490334 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.490344 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:04Z","lastTransitionTime":"2025-10-03T08:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.592924 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.592958 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.592978 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.593000 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.593012 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:04Z","lastTransitionTime":"2025-10-03T08:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.638391 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/2.log" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.640839 4765 scope.go:117] "RemoveContainer" containerID="a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd" Oct 03 08:40:04 crc kubenswrapper[4765]: E1003 08:40:04.640989 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\"" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.653071 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.663620 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.674860 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.683847 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.694120 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.695422 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.695465 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.695474 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.695489 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.695498 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:04Z","lastTransitionTime":"2025-10-03T08:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.704422 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.714173 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.730701 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb0
4f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.742182 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.751740 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.768391 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237
df78c085f40bb803f7349ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:03Z\\\",\\\"message\\\":\\\":false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133396 6432 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133455 6432 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1003 08:40:03.133502 6432 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.780521 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.797990 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.798034 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.798048 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.798065 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.798078 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:04Z","lastTransitionTime":"2025-10-03T08:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.798252 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.815552 4765 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.827105 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.837787 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.850369 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.900853 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.900901 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.900914 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.900934 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:04 crc kubenswrapper[4765]: I1003 08:40:04.900948 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:04Z","lastTransitionTime":"2025-10-03T08:40:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.004386 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.004428 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.004436 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.004452 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.004462 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:05Z","lastTransitionTime":"2025-10-03T08:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.106972 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.107068 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.107091 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.107120 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.107139 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:05Z","lastTransitionTime":"2025-10-03T08:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.209301 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.209366 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.209381 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.209408 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.209421 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:05Z","lastTransitionTime":"2025-10-03T08:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.305826 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:05 crc kubenswrapper[4765]: E1003 08:40:05.305979 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.306032 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.306107 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:05 crc kubenswrapper[4765]: E1003 08:40:05.306151 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:05 crc kubenswrapper[4765]: E1003 08:40:05.306433 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.311830 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.311856 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.311866 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.311885 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.311898 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:05Z","lastTransitionTime":"2025-10-03T08:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.413946 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.413982 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.413992 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.414006 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.414014 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:05Z","lastTransitionTime":"2025-10-03T08:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.516270 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.516313 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.516324 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.516340 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.516351 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:05Z","lastTransitionTime":"2025-10-03T08:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.618231 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.618273 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.618283 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.618300 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.618313 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:05Z","lastTransitionTime":"2025-10-03T08:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.720259 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.720302 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.720313 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.720330 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.720342 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:05Z","lastTransitionTime":"2025-10-03T08:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.821934 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.821979 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.822011 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.822028 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.822037 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:05Z","lastTransitionTime":"2025-10-03T08:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.924290 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.924350 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.924364 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.924383 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:05 crc kubenswrapper[4765]: I1003 08:40:05.924417 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:05Z","lastTransitionTime":"2025-10-03T08:40:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.026681 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.026723 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.026736 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.026777 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.026794 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:06Z","lastTransitionTime":"2025-10-03T08:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.129287 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.129323 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.129336 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.129352 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.129363 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:06Z","lastTransitionTime":"2025-10-03T08:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.232452 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.232500 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.232509 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.232523 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.232533 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:06Z","lastTransitionTime":"2025-10-03T08:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.299621 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:06 crc kubenswrapper[4765]: E1003 08:40:06.299826 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:06 crc kubenswrapper[4765]: E1003 08:40:06.299890 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs podName:6824483c-e9a7-4e95-bb3d-e00bac2af3aa nodeName:}" failed. No retries permitted until 2025-10-03 08:40:22.299874921 +0000 UTC m=+66.601369251 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs") pod "network-metrics-daemon-wdwf5" (UID: "6824483c-e9a7-4e95-bb3d-e00bac2af3aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.306480 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:06 crc kubenswrapper[4765]: E1003 08:40:06.306663 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.319973 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.332502 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.335128 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.335153 4765 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.335163 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.335196 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.335208 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:06Z","lastTransitionTime":"2025-10-03T08:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.343618 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.357479 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.369119 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.380696 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.398635 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441e
cd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.418946 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.432725 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.437861 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:06 
crc kubenswrapper[4765]: I1003 08:40:06.437902 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.437915 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.437933 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.437949 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:06Z","lastTransitionTime":"2025-10-03T08:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.457555 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237
df78c085f40bb803f7349ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:03Z\\\",\\\"message\\\":\\\":false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133396 6432 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133455 6432 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1003 08:40:03.133502 6432 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.470636 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.481723 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.490491 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.499190 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.510680 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.522869 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.535391 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.540064 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.540105 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:06 crc 
kubenswrapper[4765]: I1003 08:40:06.540117 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.540133 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.540143 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:06Z","lastTransitionTime":"2025-10-03T08:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.642838 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.642882 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.642894 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.642932 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.642942 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:06Z","lastTransitionTime":"2025-10-03T08:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.745717 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.745768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.745781 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.745801 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.745813 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:06Z","lastTransitionTime":"2025-10-03T08:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.848900 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.849146 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.849215 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.849320 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.849380 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:06Z","lastTransitionTime":"2025-10-03T08:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.952260 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.952524 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.952598 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.952725 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:06 crc kubenswrapper[4765]: I1003 08:40:06.952797 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:06Z","lastTransitionTime":"2025-10-03T08:40:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.054572 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.054827 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.054926 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.054989 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.055042 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:07Z","lastTransitionTime":"2025-10-03T08:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.108607 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.108788 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:40:39.108760039 +0000 UTC m=+83.410254369 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.157252 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.157297 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.157309 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.157326 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.157339 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:07Z","lastTransitionTime":"2025-10-03T08:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.209865 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.209933 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.209958 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.209983 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210067 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210105 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210174 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210186 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:39.210168413 +0000 UTC m=+83.511662743 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210194 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210261 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210315 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:39.210296706 +0000 UTC m=+83.511791196 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210109 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210352 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:39.210344627 +0000 UTC m=+83.511839167 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210115 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210370 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.210423 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:39.210416009 +0000 UTC m=+83.511910539 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.259688 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.259730 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.259742 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.259761 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.259773 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:07Z","lastTransitionTime":"2025-10-03T08:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.306390 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.306390 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.306414 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.306665 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.306709 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:07 crc kubenswrapper[4765]: E1003 08:40:07.306539 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.362458 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.362497 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.362508 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.362547 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.362564 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:07Z","lastTransitionTime":"2025-10-03T08:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.464998 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.465044 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.465057 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.465073 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.465084 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:07Z","lastTransitionTime":"2025-10-03T08:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.567078 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.567117 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.567126 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.567140 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.567149 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:07Z","lastTransitionTime":"2025-10-03T08:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.670778 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.670857 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.670879 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.670910 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.670933 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:07Z","lastTransitionTime":"2025-10-03T08:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.777804 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.777857 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.777881 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.777906 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.777922 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:07Z","lastTransitionTime":"2025-10-03T08:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.880284 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.880550 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.880618 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.880719 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.880820 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:07Z","lastTransitionTime":"2025-10-03T08:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.982897 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.983216 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.983228 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.983244 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:07 crc kubenswrapper[4765]: I1003 08:40:07.983255 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:07Z","lastTransitionTime":"2025-10-03T08:40:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.085411 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.085447 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.085457 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.085473 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.085482 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:08Z","lastTransitionTime":"2025-10-03T08:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.187444 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.187470 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.187479 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.187493 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.187505 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:08Z","lastTransitionTime":"2025-10-03T08:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.290394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.290439 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.290452 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.290469 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.290479 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:08Z","lastTransitionTime":"2025-10-03T08:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.306765 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:08 crc kubenswrapper[4765]: E1003 08:40:08.306948 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.393280 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.393330 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.393366 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.393386 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.393400 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:08Z","lastTransitionTime":"2025-10-03T08:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.495547 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.495671 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.495682 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.495700 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.495711 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:08Z","lastTransitionTime":"2025-10-03T08:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.598102 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.598136 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.598146 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.598162 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.598171 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:08Z","lastTransitionTime":"2025-10-03T08:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.700141 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.700177 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.700186 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.700201 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.700212 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:08Z","lastTransitionTime":"2025-10-03T08:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.802715 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.802762 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.802777 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.802796 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.802808 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:08Z","lastTransitionTime":"2025-10-03T08:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.905270 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.905323 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.905333 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.905346 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:08 crc kubenswrapper[4765]: I1003 08:40:08.905356 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:08Z","lastTransitionTime":"2025-10-03T08:40:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.007472 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.007542 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.007558 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.007580 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.007591 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:09Z","lastTransitionTime":"2025-10-03T08:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.109640 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.109698 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.109709 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.109722 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.109732 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:09Z","lastTransitionTime":"2025-10-03T08:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.212341 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.212381 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.212393 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.212410 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.212420 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:09Z","lastTransitionTime":"2025-10-03T08:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.305670 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.305774 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.305796 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:09 crc kubenswrapper[4765]: E1003 08:40:09.305892 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:09 crc kubenswrapper[4765]: E1003 08:40:09.305987 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:09 crc kubenswrapper[4765]: E1003 08:40:09.306064 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.314680 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.314733 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.314747 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.314769 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.314784 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:09Z","lastTransitionTime":"2025-10-03T08:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.416785 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.416816 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.416826 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.416844 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.416855 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:09Z","lastTransitionTime":"2025-10-03T08:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.519686 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.519726 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.519738 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.519764 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.519778 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:09Z","lastTransitionTime":"2025-10-03T08:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.622615 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.622887 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.622980 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.623051 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.623124 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:09Z","lastTransitionTime":"2025-10-03T08:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.726077 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.726340 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.726409 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.726483 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.726549 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:09Z","lastTransitionTime":"2025-10-03T08:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.829325 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.829357 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.829366 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.829383 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.829392 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:09Z","lastTransitionTime":"2025-10-03T08:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.936362 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.936454 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.936471 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.936501 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:09 crc kubenswrapper[4765]: I1003 08:40:09.936516 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:09Z","lastTransitionTime":"2025-10-03T08:40:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.038748 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.038800 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.038810 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.038825 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.038835 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:10Z","lastTransitionTime":"2025-10-03T08:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.142196 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.142485 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.142569 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.143270 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.143306 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:10Z","lastTransitionTime":"2025-10-03T08:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.246555 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.246595 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.246610 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.246631 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.246680 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:10Z","lastTransitionTime":"2025-10-03T08:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.306530 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:10 crc kubenswrapper[4765]: E1003 08:40:10.306714 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.349540 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.349575 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.349583 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.349599 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.349610 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:10Z","lastTransitionTime":"2025-10-03T08:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.451665 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.451723 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.451736 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.451754 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.451766 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:10Z","lastTransitionTime":"2025-10-03T08:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.553785 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.553818 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.553827 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.553840 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.553849 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:10Z","lastTransitionTime":"2025-10-03T08:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.656736 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.656789 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.656800 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.656820 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.656830 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:10Z","lastTransitionTime":"2025-10-03T08:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.759448 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.759486 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.759497 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.759514 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.759526 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:10Z","lastTransitionTime":"2025-10-03T08:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.861937 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.861978 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.861988 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.862002 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.862010 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:10Z","lastTransitionTime":"2025-10-03T08:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.964355 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.964403 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.964415 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.964431 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:10 crc kubenswrapper[4765]: I1003 08:40:10.964441 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:10Z","lastTransitionTime":"2025-10-03T08:40:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.066401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.066445 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.066456 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.066472 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.066481 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:11Z","lastTransitionTime":"2025-10-03T08:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.168604 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.168671 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.168683 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.168699 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.168714 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:11Z","lastTransitionTime":"2025-10-03T08:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.270342 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.270376 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.270386 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.270405 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.270422 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:11Z","lastTransitionTime":"2025-10-03T08:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.306052 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.306120 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.306078 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:11 crc kubenswrapper[4765]: E1003 08:40:11.306275 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:11 crc kubenswrapper[4765]: E1003 08:40:11.306187 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:11 crc kubenswrapper[4765]: E1003 08:40:11.306437 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.373514 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.373562 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.373575 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.373593 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.373606 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:11Z","lastTransitionTime":"2025-10-03T08:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.476584 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.476641 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.476672 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.476693 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.476709 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:11Z","lastTransitionTime":"2025-10-03T08:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.579852 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.580062 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.580134 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.580158 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.580170 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:11Z","lastTransitionTime":"2025-10-03T08:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.682556 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.682709 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.682742 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.682784 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.682819 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:11Z","lastTransitionTime":"2025-10-03T08:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.785764 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.785876 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.785901 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.785933 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.785960 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:11Z","lastTransitionTime":"2025-10-03T08:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.890004 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.890062 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.890082 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.890113 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.890140 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:11Z","lastTransitionTime":"2025-10-03T08:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.992418 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.992463 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.992489 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.992504 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:11 crc kubenswrapper[4765]: I1003 08:40:11.992515 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:11Z","lastTransitionTime":"2025-10-03T08:40:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.094591 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.094636 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.094684 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.094704 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.094714 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.199870 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.199915 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.199928 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.199946 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.199958 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.302633 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.302695 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.302704 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.302718 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.302726 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.307014 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:12 crc kubenswrapper[4765]: E1003 08:40:12.307270 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
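The pod-sync failure above ("network is not ready ... no CNI configuration file in /etc/kubernetes/cni/net.d/") means the kubelet's CNI config directory is still empty at this point in the boot. Below is a minimal, hypothetical Go diagnostic sketch, not part of the cluster tooling; it lists that directory (path taken from the log message, adjust if your kubelet uses a different --cni-conf-dir) and reports whether any of the file types the CNI config loader typically accepts are present.

// cnicheck.go: hypothetical sketch to reproduce the kubelet's "no CNI configuration file" complaint.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // taken from the log line above
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("cannot read %s: %v", dir, err)
	}
	found := 0
	for _, e := range entries {
		// .conf/.conflist/.json are the extensions CNI config loaders usually pick up.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files found; the network plugin has not written its config yet")
	}
}

An empty directory is consistent with the node staying NotReady and with sandbox creation for openshift-multus/network-metrics-daemon-wdwf5 being skipped until the network operator writes its configuration.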
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.404829 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.404867 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.404880 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.404897 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.404909 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.507140 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.507467 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.507587 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.507724 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.507817 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.609740 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.609982 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.610085 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.610185 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.610267 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.651676 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.651901 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.652036 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.652148 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.652224 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: E1003 08:40:12.667059 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.671376 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.671916 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
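The "Error updating node status" entry above pins the node-status patch failure on the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743: its serving certificate expired on 2025-08-24T17:21:41Z while the current time is 2025-10-03. A hedged Go sketch (hypothetical diagnostic, run on the node itself since the webhook listens on loopback) to read that certificate's validity window independently of the kubelet; InsecureSkipVerify is deliberate because the goal is to inspect the expired certificate, not to trust it.

// certcheck.go: hypothetical sketch to confirm the webhook serving certificate has expired.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // address taken from the webhook URL in the log
	conn, err := tls.Dial("tcp", addr, &tls.Config{
		InsecureSkipVerify: true, // we only want to read the certificate, not validate it
	})
	if err != nil {
		log.Fatalf("TLS dial to %s failed: %v", addr, err)
	}
	defer conn.Close()

	state := conn.ConnectionState()
	if len(state.PeerCertificates) == 0 {
		log.Fatal("no peer certificates presented")
	}
	cert := state.PeerCertificates[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	fmt.Printf("expired:   %v\n", time.Now().After(cert.NotAfter))
}

If notAfter matches the 2025-08-24T17:21:41Z value quoted in the error, the retries that follow will keep failing until that certificate is rotated; the kubelet itself cannot patch around it.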
event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.672041 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.672135 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.672218 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: E1003 08:40:12.683979 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.688125 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.688162 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
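The "Node became not ready" entries that recur throughout this window (setters.go:603) embed the node's Ready condition as a JSON object. As a purely illustrative sketch, the following Go program unmarshals one of those objects to show its shape; the field names and the payload are copied from the log entries above, not from the Kubernetes API packages.

// condition.go: hypothetical sketch that decodes a logged Ready condition.
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// nodeCondition mirrors the fields that appear in the logged condition object.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Payload copied verbatim from one of the setters.go:603 entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatalf("unmarshal: %v", err)
	}
	fmt.Printf("%s=%s reason=%s\n", c.Type, c.Status, c.Reason)
	fmt.Println("message:", c.Message)
}

The reason/message pair here is the same text the kubelet keeps trying, and failing, to push in the status patches above, which is why the condition never transitions to Ready in this log window.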
event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.688173 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.688188 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.688199 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: E1003 08:40:12.700459 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.704563 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.704624 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.704671 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.704692 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.704702 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: E1003 08:40:12.718625 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.722787 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.722855 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.722864 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.722904 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.722916 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: E1003 08:40:12.733563 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: E1003 08:40:12.733714 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.735288 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.735328 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.735337 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.735353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.735362 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.811268 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.820619 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.824447 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.837389 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.838162 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.838207 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.838216 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.838237 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.838248 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.851858 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.874046 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:03Z\\\",\\\"message\\\":\\\":false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133396 6432 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133455 6432 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1003 08:40:03.133502 6432 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.890329 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.905336 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.915064 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.924542 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.934706 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.940003 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.940070 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.940083 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.940100 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.940111 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:12Z","lastTransitionTime":"2025-10-03T08:40:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.948924 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 
08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.959783 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.974053 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:12 crc kubenswrapper[4765]: I1003 08:40:12.984219 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.001637 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:12Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.011936 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:13Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.022099 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:13Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.032161 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:13Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.042703 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.042731 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.042744 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.042759 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.042768 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:13Z","lastTransitionTime":"2025-10-03T08:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.144854 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.145162 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.145235 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.145319 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.145573 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:13Z","lastTransitionTime":"2025-10-03T08:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.248699 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.248755 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.248770 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.248788 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.248801 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:13Z","lastTransitionTime":"2025-10-03T08:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.306402 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.306432 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.306402 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:13 crc kubenswrapper[4765]: E1003 08:40:13.306536 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:13 crc kubenswrapper[4765]: E1003 08:40:13.306600 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:13 crc kubenswrapper[4765]: E1003 08:40:13.306761 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.351273 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.351314 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.351326 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.351341 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.351353 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:13Z","lastTransitionTime":"2025-10-03T08:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.453588 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.453627 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.453636 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.453682 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.453695 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:13Z","lastTransitionTime":"2025-10-03T08:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.555741 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.555803 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.555816 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.555830 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.555840 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:13Z","lastTransitionTime":"2025-10-03T08:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.658362 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.658408 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.658419 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.658438 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.658450 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:13Z","lastTransitionTime":"2025-10-03T08:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.761007 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.761048 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.761057 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.761074 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.761083 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:13Z","lastTransitionTime":"2025-10-03T08:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.863444 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.863493 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.863503 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.863520 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.863533 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:13Z","lastTransitionTime":"2025-10-03T08:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.965553 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.965624 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.965640 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.965685 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:13 crc kubenswrapper[4765]: I1003 08:40:13.965697 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:13Z","lastTransitionTime":"2025-10-03T08:40:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.068289 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.068324 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.068332 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.068349 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.068357 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:14Z","lastTransitionTime":"2025-10-03T08:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.170631 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.170687 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.170697 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.170712 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.170721 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:14Z","lastTransitionTime":"2025-10-03T08:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.273014 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.273075 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.273088 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.273125 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.273140 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:14Z","lastTransitionTime":"2025-10-03T08:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.306011 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:14 crc kubenswrapper[4765]: E1003 08:40:14.306178 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.376214 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.376265 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.376277 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.376295 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.376307 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:14Z","lastTransitionTime":"2025-10-03T08:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.481981 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.482019 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.482031 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.482049 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.482061 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:14Z","lastTransitionTime":"2025-10-03T08:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.583727 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.583970 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.584041 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.584123 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.584190 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:14Z","lastTransitionTime":"2025-10-03T08:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.686630 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.686681 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.686697 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.686714 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.686725 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:14Z","lastTransitionTime":"2025-10-03T08:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.788858 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.788885 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.788893 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.788906 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.788915 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:14Z","lastTransitionTime":"2025-10-03T08:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.890707 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.890745 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.890757 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.890774 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.890785 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:14Z","lastTransitionTime":"2025-10-03T08:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.992532 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.992566 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.992578 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.992596 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:14 crc kubenswrapper[4765]: I1003 08:40:14.992607 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:14Z","lastTransitionTime":"2025-10-03T08:40:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.094834 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.094873 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.094883 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.094897 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.094906 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:15Z","lastTransitionTime":"2025-10-03T08:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.197300 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.197343 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.197354 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.197370 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.197382 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:15Z","lastTransitionTime":"2025-10-03T08:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.300211 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.300244 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.300257 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.300271 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.300281 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:15Z","lastTransitionTime":"2025-10-03T08:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.306510 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.306536 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.306631 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:15 crc kubenswrapper[4765]: E1003 08:40:15.306771 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:15 crc kubenswrapper[4765]: E1003 08:40:15.306874 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:15 crc kubenswrapper[4765]: E1003 08:40:15.306977 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.403294 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.403918 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.404002 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.404091 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.404177 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:15Z","lastTransitionTime":"2025-10-03T08:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.506436 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.506504 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.506522 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.506544 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.506558 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:15Z","lastTransitionTime":"2025-10-03T08:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.609031 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.609076 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.609090 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.609109 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.609120 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:15Z","lastTransitionTime":"2025-10-03T08:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.712279 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.712329 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.712339 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.712356 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.712370 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:15Z","lastTransitionTime":"2025-10-03T08:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.815265 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.815313 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.815323 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.815396 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.815411 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:15Z","lastTransitionTime":"2025-10-03T08:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.918400 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.918451 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.918463 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.918484 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:15 crc kubenswrapper[4765]: I1003 08:40:15.918497 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:15Z","lastTransitionTime":"2025-10-03T08:40:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.021088 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.021140 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.021150 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.021197 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.021216 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:16Z","lastTransitionTime":"2025-10-03T08:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.123714 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.123763 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.123771 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.123792 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.123810 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:16Z","lastTransitionTime":"2025-10-03T08:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.226206 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.226269 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.226280 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.226301 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.226316 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:16Z","lastTransitionTime":"2025-10-03T08:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.306763 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:16 crc kubenswrapper[4765]: E1003 08:40:16.306963 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.327437 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.329394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.329422 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.329430 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.329444 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.329454 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:16Z","lastTransitionTime":"2025-10-03T08:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.337872 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.350457 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.360463 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.370573 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.382606 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.395708 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.417655 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:03Z\\\",\\\"message\\\":\\\":false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133396 6432 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133455 6432 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1003 08:40:03.133502 6432 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.431260 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.431353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.431376 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.431404 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.431425 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:16Z","lastTransitionTime":"2025-10-03T08:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.434065 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.451543 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.462822 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.474035 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.485253 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.499878 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.511123 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a9b9fb7-e509-45bf-8ceb-fed6c0d26821\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78b8f31e2b3f0891e3909baeb57c5a2dfe52c0e85d1aa86fe045ed54c56d5202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cedef6c592c877edfd8afe1dc09789fdc84a816a6a84d9ac9115fa494d8b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e92137f7438e3f6ae4b9225226f23f10f0e5e8a2b6a86f486971315d8bee00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.523202 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.534047 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.534082 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.534091 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.534110 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.534125 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:16Z","lastTransitionTime":"2025-10-03T08:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.537530 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.549446 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:16Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.636532 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.636578 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.636586 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.636601 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.636611 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:16Z","lastTransitionTime":"2025-10-03T08:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.739297 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.739343 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.739354 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.739370 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.739379 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:16Z","lastTransitionTime":"2025-10-03T08:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.842093 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.842138 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.842188 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.842206 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.842216 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:16Z","lastTransitionTime":"2025-10-03T08:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.944043 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.944085 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.944094 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.944108 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:16 crc kubenswrapper[4765]: I1003 08:40:16.944118 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:16Z","lastTransitionTime":"2025-10-03T08:40:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.046514 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.046610 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.046622 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.046662 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.046675 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:17Z","lastTransitionTime":"2025-10-03T08:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.149595 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.149667 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.149681 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.149702 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.149711 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:17Z","lastTransitionTime":"2025-10-03T08:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.252086 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.252124 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.252133 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.252150 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.252161 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:17Z","lastTransitionTime":"2025-10-03T08:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.306125 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.306186 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:17 crc kubenswrapper[4765]: E1003 08:40:17.306889 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:17 crc kubenswrapper[4765]: E1003 08:40:17.306344 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.306246 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:17 crc kubenswrapper[4765]: E1003 08:40:17.307042 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.362393 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.362445 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.362457 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.362473 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.362482 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:17Z","lastTransitionTime":"2025-10-03T08:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.465014 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.465519 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.465587 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.465672 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.465753 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:17Z","lastTransitionTime":"2025-10-03T08:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.568609 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.568677 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.568690 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.568707 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.568719 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:17Z","lastTransitionTime":"2025-10-03T08:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.671456 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.671516 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.671525 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.671549 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.671560 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:17Z","lastTransitionTime":"2025-10-03T08:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.774125 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.774162 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.774173 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.774189 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.774231 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:17Z","lastTransitionTime":"2025-10-03T08:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.876852 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.876885 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.876896 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.876940 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.876959 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:17Z","lastTransitionTime":"2025-10-03T08:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.978694 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.978929 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.979018 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.979083 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:17 crc kubenswrapper[4765]: I1003 08:40:17.979180 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:17Z","lastTransitionTime":"2025-10-03T08:40:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.081312 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.081344 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.081353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.081375 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.081391 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:18Z","lastTransitionTime":"2025-10-03T08:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.183043 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.183096 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.183105 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.183120 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.183131 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:18Z","lastTransitionTime":"2025-10-03T08:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.285845 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.285879 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.285893 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.285912 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.285924 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:18Z","lastTransitionTime":"2025-10-03T08:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.306619 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:18 crc kubenswrapper[4765]: E1003 08:40:18.306781 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.388211 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.388251 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.388262 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.388280 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.388290 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:18Z","lastTransitionTime":"2025-10-03T08:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.491021 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.491078 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.491094 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.491116 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.491128 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:18Z","lastTransitionTime":"2025-10-03T08:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.593496 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.593548 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.593563 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.593583 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.593595 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:18Z","lastTransitionTime":"2025-10-03T08:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.695959 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.696012 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.696021 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.696035 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.696045 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:18Z","lastTransitionTime":"2025-10-03T08:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.798638 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.798699 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.798710 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.798727 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.798737 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:18Z","lastTransitionTime":"2025-10-03T08:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.900756 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.900788 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.900797 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.900810 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:18 crc kubenswrapper[4765]: I1003 08:40:18.900819 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:18Z","lastTransitionTime":"2025-10-03T08:40:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.003506 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.003562 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.003571 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.003592 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.003604 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:19Z","lastTransitionTime":"2025-10-03T08:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.106475 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.106513 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.106521 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.106536 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.106545 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:19Z","lastTransitionTime":"2025-10-03T08:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.208935 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.208974 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.208984 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.208999 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.209008 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:19Z","lastTransitionTime":"2025-10-03T08:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.306395 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.306482 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:19 crc kubenswrapper[4765]: E1003 08:40:19.306534 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:19 crc kubenswrapper[4765]: E1003 08:40:19.306555 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.306856 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:19 crc kubenswrapper[4765]: E1003 08:40:19.307048 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.311031 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.311125 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.311194 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.311258 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.311318 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:19Z","lastTransitionTime":"2025-10-03T08:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.413303 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.413349 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.413358 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.413373 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.413382 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:19Z","lastTransitionTime":"2025-10-03T08:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.516112 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.516151 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.516161 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.516180 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.516189 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:19Z","lastTransitionTime":"2025-10-03T08:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.618135 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.618189 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.618203 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.618222 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.618234 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:19Z","lastTransitionTime":"2025-10-03T08:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.720271 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.720313 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.720355 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.720372 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.720383 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:19Z","lastTransitionTime":"2025-10-03T08:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.823000 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.823044 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.823054 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.823069 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.823078 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:19Z","lastTransitionTime":"2025-10-03T08:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.925348 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.925388 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.925401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.925418 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:19 crc kubenswrapper[4765]: I1003 08:40:19.925428 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:19Z","lastTransitionTime":"2025-10-03T08:40:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.027627 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.028010 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.028095 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.028188 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.028283 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:20Z","lastTransitionTime":"2025-10-03T08:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.130978 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.131033 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.131046 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.131062 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.131073 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:20Z","lastTransitionTime":"2025-10-03T08:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.233372 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.233413 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.233423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.233440 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.233452 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:20Z","lastTransitionTime":"2025-10-03T08:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.306463 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:20 crc kubenswrapper[4765]: E1003 08:40:20.306630 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.307434 4765 scope.go:117] "RemoveContainer" containerID="a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd" Oct 03 08:40:20 crc kubenswrapper[4765]: E1003 08:40:20.307782 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\"" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.335729 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.335772 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.335784 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.335798 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.335808 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:20Z","lastTransitionTime":"2025-10-03T08:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.438353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.438402 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.438412 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.438428 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.438440 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:20Z","lastTransitionTime":"2025-10-03T08:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.541253 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.541305 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.541316 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.541333 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.541346 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:20Z","lastTransitionTime":"2025-10-03T08:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.644077 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.644131 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.644144 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.644163 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.644175 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:20Z","lastTransitionTime":"2025-10-03T08:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.747160 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.747230 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.747250 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.747278 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.747294 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:20Z","lastTransitionTime":"2025-10-03T08:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.850438 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.850850 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.850931 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.851010 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.851075 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:20Z","lastTransitionTime":"2025-10-03T08:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.954137 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.954176 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.954187 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.954204 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:20 crc kubenswrapper[4765]: I1003 08:40:20.954217 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:20Z","lastTransitionTime":"2025-10-03T08:40:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.056718 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.056764 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.056775 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.056792 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.056801 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:21Z","lastTransitionTime":"2025-10-03T08:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.159520 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.159855 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.159941 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.160041 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.160130 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:21Z","lastTransitionTime":"2025-10-03T08:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.263983 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.264032 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.264044 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.264066 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.264079 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:21Z","lastTransitionTime":"2025-10-03T08:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.305915 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.305977 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.306036 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:21 crc kubenswrapper[4765]: E1003 08:40:21.306073 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:21 crc kubenswrapper[4765]: E1003 08:40:21.306137 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:21 crc kubenswrapper[4765]: E1003 08:40:21.306286 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.366640 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.366703 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.366717 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.366738 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.366753 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:21Z","lastTransitionTime":"2025-10-03T08:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.468468 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.468507 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.468517 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.468532 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.468543 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:21Z","lastTransitionTime":"2025-10-03T08:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.570613 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.570665 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.570673 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.570691 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.570700 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:21Z","lastTransitionTime":"2025-10-03T08:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.672836 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.672880 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.672889 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.672905 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.672916 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:21Z","lastTransitionTime":"2025-10-03T08:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.775149 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.775184 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.775192 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.775207 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.775215 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:21Z","lastTransitionTime":"2025-10-03T08:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.877470 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.877508 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.877519 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.877534 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.877543 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:21Z","lastTransitionTime":"2025-10-03T08:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.979721 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.979771 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.979784 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.979802 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:21 crc kubenswrapper[4765]: I1003 08:40:21.979814 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:21Z","lastTransitionTime":"2025-10-03T08:40:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.082196 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.082245 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.082255 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.082273 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.082283 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.184317 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.184364 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.184378 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.184430 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.184450 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.286672 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.286720 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.286733 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.286752 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.286763 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.306466 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:22 crc kubenswrapper[4765]: E1003 08:40:22.306766 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.318191 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.367956 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:22 crc kubenswrapper[4765]: E1003 08:40:22.368137 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:22 crc kubenswrapper[4765]: E1003 08:40:22.368241 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs podName:6824483c-e9a7-4e95-bb3d-e00bac2af3aa nodeName:}" failed. No retries permitted until 2025-10-03 08:40:54.368212172 +0000 UTC m=+98.669706502 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs") pod "network-metrics-daemon-wdwf5" (UID: "6824483c-e9a7-4e95-bb3d-e00bac2af3aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.389255 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.389304 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.389316 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.389336 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.389349 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.492347 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.492438 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.492450 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.492467 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.492479 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.594771 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.594806 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.594818 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.594836 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.594848 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.696882 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.696918 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.696929 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.696946 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.696957 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.799362 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.799399 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.799414 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.799429 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.799438 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.902520 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.902560 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.902569 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.902584 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.902595 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.909516 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.909553 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.909563 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.909578 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.909587 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: E1003 08:40:22.922484 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:22Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.926499 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.926553 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.926566 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.926587 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.926599 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: E1003 08:40:22.938587 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:22Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.942223 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.942366 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.942475 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.942576 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.942691 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: E1003 08:40:22.954722 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:22Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.957992 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.958032 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.958042 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.958061 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.958073 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: E1003 08:40:22.972843 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:22Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.976437 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.976462 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.976472 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.976487 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:22 crc kubenswrapper[4765]: I1003 08:40:22.976497 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:22Z","lastTransitionTime":"2025-10-03T08:40:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:22 crc kubenswrapper[4765]: E1003 08:40:22.989258 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:22Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:22 crc kubenswrapper[4765]: E1003 08:40:22.989457 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.005342 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.005379 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.005394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.005412 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.005424 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:23Z","lastTransitionTime":"2025-10-03T08:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.107463 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.107498 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.107507 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.107524 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.107545 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:23Z","lastTransitionTime":"2025-10-03T08:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.210317 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.210351 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.210359 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.210374 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.210383 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:23Z","lastTransitionTime":"2025-10-03T08:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.306021 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.306079 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:23 crc kubenswrapper[4765]: E1003 08:40:23.306155 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.306099 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:23 crc kubenswrapper[4765]: E1003 08:40:23.306212 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:23 crc kubenswrapper[4765]: E1003 08:40:23.306277 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.312669 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.312710 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.312722 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.312737 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.312750 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:23Z","lastTransitionTime":"2025-10-03T08:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.415115 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.415154 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.415166 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.415182 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.415192 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:23Z","lastTransitionTime":"2025-10-03T08:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.518269 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.518305 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.518317 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.518336 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.518347 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:23Z","lastTransitionTime":"2025-10-03T08:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.620300 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.620336 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.620347 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.620362 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.620370 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:23Z","lastTransitionTime":"2025-10-03T08:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.722783 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.722828 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.722841 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.722860 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.722874 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:23Z","lastTransitionTime":"2025-10-03T08:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.826154 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.826189 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.826200 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.826217 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.826227 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:23Z","lastTransitionTime":"2025-10-03T08:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.929045 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.929085 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.929108 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.929124 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:23 crc kubenswrapper[4765]: I1003 08:40:23.929140 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:23Z","lastTransitionTime":"2025-10-03T08:40:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.031529 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.031766 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.031776 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.031791 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.031800 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:24Z","lastTransitionTime":"2025-10-03T08:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.133390 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.133432 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.133443 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.133460 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.133470 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:24Z","lastTransitionTime":"2025-10-03T08:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.235701 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.235731 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.235739 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.235754 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.235767 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:24Z","lastTransitionTime":"2025-10-03T08:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.306569 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:24 crc kubenswrapper[4765]: E1003 08:40:24.306766 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.338465 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.338500 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.338509 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.338523 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.338535 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:24Z","lastTransitionTime":"2025-10-03T08:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.440308 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.440345 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.440362 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.440380 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.440392 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:24Z","lastTransitionTime":"2025-10-03T08:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.543002 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.543046 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.543055 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.543069 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.543079 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:24Z","lastTransitionTime":"2025-10-03T08:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.645661 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.645701 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.645711 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.645727 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.645739 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:24Z","lastTransitionTime":"2025-10-03T08:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.694144 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-csb5z_912755c8-dd28-4fbc-82de-9cf85df54f4f/kube-multus/0.log" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.694199 4765 generic.go:334] "Generic (PLEG): container finished" podID="912755c8-dd28-4fbc-82de-9cf85df54f4f" containerID="d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1" exitCode=1 Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.694238 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-csb5z" event={"ID":"912755c8-dd28-4fbc-82de-9cf85df54f4f","Type":"ContainerDied","Data":"d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1"} Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.694696 4765 scope.go:117] "RemoveContainer" containerID="d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.713343 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648d26ad-0ca3-4ce7-885d-6aab568ed72d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dee8eb78cfc7f681a7009b32e7521490cfa896aee35f8f552a150738224517be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b
dfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.730480 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.743479 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a9b9fb7-e509-45bf-8ceb-fed6c0d26821\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78b8f31e2b3f0891e3909baeb57c5a2dfe52c0e85d1aa86fe045ed54c56d5202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cedef6c592c877edfd8afe1dc09789fdc84a816a6a84d9ac9115fa494d8b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e92137f7438e3f6ae4b9225226f23f10f0e5e8a2b6a86f486971315d8bee00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.748155 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.748184 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.748194 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.748211 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.748221 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:24Z","lastTransitionTime":"2025-10-03T08:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.756913 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.770640 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.781857 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.808426 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.821816 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.833969 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:23Z\\\",\\\"message\\\":\\\"2025-10-03T08:39:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf\\\\n2025-10-03T08:39:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf to /host/opt/cni/bin/\\\\n2025-10-03T08:39:38Z [verbose] multus-daemon 
started\\\\n2025-10-03T08:39:38Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:40:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.845386 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.850114 4765 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.850147 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.850157 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.850173 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.850185 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:24Z","lastTransitionTime":"2025-10-03T08:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.857528 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.869708 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.880541 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.898849 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:03Z\\\",\\\"message\\\":\\\":false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133396 6432 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133455 6432 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1003 08:40:03.133502 6432 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.910994 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.923376 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.932399 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.942071 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.951752 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:24Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.952572 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.952606 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.952618 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.952636 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:24 crc kubenswrapper[4765]: I1003 08:40:24.952664 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:24Z","lastTransitionTime":"2025-10-03T08:40:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.054889 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.054940 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.054954 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.054973 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.054984 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:25Z","lastTransitionTime":"2025-10-03T08:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.157334 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.157378 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.157387 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.157401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.157410 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:25Z","lastTransitionTime":"2025-10-03T08:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.261069 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.261112 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.261126 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.261147 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.261161 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:25Z","lastTransitionTime":"2025-10-03T08:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.305613 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.305613 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.306043 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:25 crc kubenswrapper[4765]: E1003 08:40:25.306253 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:25 crc kubenswrapper[4765]: E1003 08:40:25.306335 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:25 crc kubenswrapper[4765]: E1003 08:40:25.306439 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.363407 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.363453 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.363466 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.363484 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.363495 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:25Z","lastTransitionTime":"2025-10-03T08:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.465758 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.466062 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.466155 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.466248 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.466333 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:25Z","lastTransitionTime":"2025-10-03T08:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.569213 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.569278 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.569290 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.569306 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.569318 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:25Z","lastTransitionTime":"2025-10-03T08:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.671754 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.671788 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.671799 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.671816 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.671827 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:25Z","lastTransitionTime":"2025-10-03T08:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.699027 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-csb5z_912755c8-dd28-4fbc-82de-9cf85df54f4f/kube-multus/0.log" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.699087 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-csb5z" event={"ID":"912755c8-dd28-4fbc-82de-9cf85df54f4f","Type":"ContainerStarted","Data":"52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.715631 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\
\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.728025 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a9b9fb7-e509-45bf-8ceb-fed6c0d26821\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78b8f31e2b3f0891e3909baeb57c5a2dfe52c0e85d1aa86fe045ed54c56d5202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cedef6c592c877edfd8afe1dc09789fdc84a816a6a84d9ac9115fa494d8b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e92137f7438e3f6ae4b9225226f23f10f0e5e8a2b6a86f486971315d8bee00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.739331 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.753173 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.763783 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.773846 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.773903 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.773919 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.773937 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.774235 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:25Z","lastTransitionTime":"2025-10-03T08:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.774706 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648d26ad-0ca3-4ce7-885d-6aab568ed72d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dee8eb78cfc7f681a7009b32e7521490cfa896aee35f8f552a150738224517be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-
lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.786037 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.803400 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:23Z\\\",\\\"message\\\":\\\"2025-10-03T08:39:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf\\\\n2025-10-03T08:39:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf to /host/opt/cni/bin/\\\\n2025-10-03T08:39:38Z [verbose] multus-daemon started\\\\n2025-10-03T08:39:38Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:40:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.835800 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.863870 4765 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.876343 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.876380 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.876389 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.876404 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.876415 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:25Z","lastTransitionTime":"2025-10-03T08:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.881342 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.894623 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.912976 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237
df78c085f40bb803f7349ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:03Z\\\",\\\"message\\\":\\\":false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133396 6432 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133455 6432 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1003 08:40:03.133502 6432 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.927002 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.945089 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.956474 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.967412 4765 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.979093 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.979132 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.979140 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.979175 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.979189 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:25Z","lastTransitionTime":"2025-10-03T08:40:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.981030 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:25 crc kubenswrapper[4765]: I1003 08:40:25.995042 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799
488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.082055 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.082091 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.082101 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.082117 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.082129 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:26Z","lastTransitionTime":"2025-10-03T08:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.183869 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.183928 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.183945 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.183963 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.183975 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:26Z","lastTransitionTime":"2025-10-03T08:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.286444 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.286493 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.286502 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.286518 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.286528 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:26Z","lastTransitionTime":"2025-10-03T08:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.306165 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:26 crc kubenswrapper[4765]: E1003 08:40:26.306325 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.317350 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.327594 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648d26ad-0ca3-4ce7-885d-6aab568ed72d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dee8eb78cfc7f681a7009b32e7521490cfa896aee35f8f552a150738224517be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.342511 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.354211 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a9b9fb7-e509-45bf-8ceb-fed6c0d26821\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78b8f31e2b3f0891e3909baeb57c5a2dfe52c0e85d1aa86fe045ed54c56d5202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cedef6c592c877edfd8afe1dc09789fdc84a816a6a84d9ac9115fa494d8b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e92137f7438e3f6ae4b9225226f23f10f0e5e8a2b6a86f486971315d8bee00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.365803 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.379325 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.389373 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.389410 4765 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.389419 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.389434 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.389443 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:26Z","lastTransitionTime":"2025-10-03T08:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.400078 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39
:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.417558 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.430233 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:23Z\\\",\\\"message\\\":\\\"2025-10-03T08:39:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf\\\\n2025-10-03T08:39:38+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf to /host/opt/cni/bin/\\\\n2025-10-03T08:39:38Z [verbose] multus-daemon started\\\\n2025-10-03T08:39:38Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:40:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.440579 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.452355 4765 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.464957 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.476100 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.491795 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.491849 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.491859 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.491877 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.491907 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:26Z","lastTransitionTime":"2025-10-03T08:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.494751 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:03Z\\\",\\\"message\\\":\\\":false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133396 6432 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133455 6432 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1003 08:40:03.133502 6432 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.507950 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.520802 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc34
7965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.530166 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.539354 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.550678 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:26Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.593922 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.593968 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.593977 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.593992 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.594002 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:26Z","lastTransitionTime":"2025-10-03T08:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.696501 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.696546 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.696559 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.696578 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.696590 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:26Z","lastTransitionTime":"2025-10-03T08:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.799344 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.799412 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.799430 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.799476 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.799492 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:26Z","lastTransitionTime":"2025-10-03T08:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.901845 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.901906 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.901918 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.901937 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:26 crc kubenswrapper[4765]: I1003 08:40:26.901949 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:26Z","lastTransitionTime":"2025-10-03T08:40:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.004196 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.004465 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.004541 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.004624 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.004735 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:27Z","lastTransitionTime":"2025-10-03T08:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.106901 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.106937 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.106946 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.106963 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.106973 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:27Z","lastTransitionTime":"2025-10-03T08:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.210215 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.210259 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.210271 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.210289 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.210301 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:27Z","lastTransitionTime":"2025-10-03T08:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.306176 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:27 crc kubenswrapper[4765]: E1003 08:40:27.306337 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.306424 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:27 crc kubenswrapper[4765]: E1003 08:40:27.306556 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.306424 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:27 crc kubenswrapper[4765]: E1003 08:40:27.306778 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.312398 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.312437 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.312451 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.312468 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.312480 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:27Z","lastTransitionTime":"2025-10-03T08:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.414161 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.414191 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.414201 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.414214 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.414222 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:27Z","lastTransitionTime":"2025-10-03T08:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.516694 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.516741 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.516752 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.516768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.516782 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:27Z","lastTransitionTime":"2025-10-03T08:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.619155 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.619194 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.619205 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.619223 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.619234 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:27Z","lastTransitionTime":"2025-10-03T08:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.721712 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.721750 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.721761 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.721775 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.721783 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:27Z","lastTransitionTime":"2025-10-03T08:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.823754 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.823798 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.823808 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.823823 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.823832 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:27Z","lastTransitionTime":"2025-10-03T08:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.928057 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.928090 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.928098 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.928111 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:27 crc kubenswrapper[4765]: I1003 08:40:27.928121 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:27Z","lastTransitionTime":"2025-10-03T08:40:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.029786 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.029828 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.029842 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.029858 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.029870 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:28Z","lastTransitionTime":"2025-10-03T08:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.132818 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.132884 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.132899 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.132924 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.132943 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:28Z","lastTransitionTime":"2025-10-03T08:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.235672 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.235950 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.236071 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.236174 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.236245 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:28Z","lastTransitionTime":"2025-10-03T08:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.306597 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:28 crc kubenswrapper[4765]: E1003 08:40:28.306859 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.339212 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.339267 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.339279 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.339299 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.339315 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:28Z","lastTransitionTime":"2025-10-03T08:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.441846 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.441909 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.441923 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.441945 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.441962 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:28Z","lastTransitionTime":"2025-10-03T08:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.544901 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.544965 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.544979 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.545003 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.545017 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:28Z","lastTransitionTime":"2025-10-03T08:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.648251 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.648298 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.648308 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.648325 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.648337 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:28Z","lastTransitionTime":"2025-10-03T08:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.752012 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.752079 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.752092 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.752114 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.752139 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:28Z","lastTransitionTime":"2025-10-03T08:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.854637 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.854725 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.854743 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.854798 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.854813 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:28Z","lastTransitionTime":"2025-10-03T08:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.957378 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.957428 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.957441 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.957460 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:28 crc kubenswrapper[4765]: I1003 08:40:28.957473 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:28Z","lastTransitionTime":"2025-10-03T08:40:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.060189 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.060236 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.060249 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.060274 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.060290 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:29Z","lastTransitionTime":"2025-10-03T08:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.162864 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.162906 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.162916 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.162933 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.162948 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:29Z","lastTransitionTime":"2025-10-03T08:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.266296 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.266354 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.266371 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.266390 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.266402 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:29Z","lastTransitionTime":"2025-10-03T08:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.306129 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:29 crc kubenswrapper[4765]: E1003 08:40:29.306309 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.306169 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:29 crc kubenswrapper[4765]: E1003 08:40:29.306402 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.306129 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:29 crc kubenswrapper[4765]: E1003 08:40:29.306472 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.368614 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.368682 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.368775 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.368794 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.368806 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:29Z","lastTransitionTime":"2025-10-03T08:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.471224 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.471263 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.471275 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.471292 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.471306 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:29Z","lastTransitionTime":"2025-10-03T08:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.573598 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.573698 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.573712 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.573728 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.573740 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:29Z","lastTransitionTime":"2025-10-03T08:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.676112 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.676163 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.676177 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.676196 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.676211 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:29Z","lastTransitionTime":"2025-10-03T08:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.778790 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.778828 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.778838 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.778854 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.778866 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:29Z","lastTransitionTime":"2025-10-03T08:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.881444 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.881491 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.881506 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.881525 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.881538 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:29Z","lastTransitionTime":"2025-10-03T08:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.984232 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.984274 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.984292 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.984315 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:29 crc kubenswrapper[4765]: I1003 08:40:29.984327 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:29Z","lastTransitionTime":"2025-10-03T08:40:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.086598 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.086638 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.086664 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.086680 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.086692 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:30Z","lastTransitionTime":"2025-10-03T08:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.189195 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.189267 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.189276 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.189300 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.189317 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:30Z","lastTransitionTime":"2025-10-03T08:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.291392 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.291439 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.291450 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.291467 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.291480 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:30Z","lastTransitionTime":"2025-10-03T08:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.306050 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:30 crc kubenswrapper[4765]: E1003 08:40:30.306199 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.394009 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.394055 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.394068 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.394087 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.394096 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:30Z","lastTransitionTime":"2025-10-03T08:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.497071 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.497118 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.497130 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.497149 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.497162 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:30Z","lastTransitionTime":"2025-10-03T08:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.599273 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.599306 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.599315 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.599328 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.599336 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:30Z","lastTransitionTime":"2025-10-03T08:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.702229 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.702320 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.702332 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.702353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.702366 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:30Z","lastTransitionTime":"2025-10-03T08:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.804263 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.804865 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.804935 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.805015 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.805076 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:30Z","lastTransitionTime":"2025-10-03T08:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.907081 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.907322 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.907389 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.907463 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:30 crc kubenswrapper[4765]: I1003 08:40:30.907533 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:30Z","lastTransitionTime":"2025-10-03T08:40:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.010203 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.010461 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.010525 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.010613 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.010742 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:31Z","lastTransitionTime":"2025-10-03T08:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.113196 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.113499 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.113583 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.113675 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.113765 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:31Z","lastTransitionTime":"2025-10-03T08:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.217252 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.217300 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.217312 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.217330 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.217340 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:31Z","lastTransitionTime":"2025-10-03T08:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.306701 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.306717 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:31 crc kubenswrapper[4765]: E1003 08:40:31.306937 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.306717 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:31 crc kubenswrapper[4765]: E1003 08:40:31.307021 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:31 crc kubenswrapper[4765]: E1003 08:40:31.307084 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
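The kubelet is recording the same Ready=False condition for node crc (reason KubeletNotReady) roughly every 100 ms through this stretch; only the heartbeat timestamps change between entries. A minimal client-go sketch for reading that condition straight off the node object, assuming the API server is reachable and using a placeholder kubeconfig path (neither assumption comes from this log):

```go
// Minimal sketch, not from the log: read the Ready condition of node "crc" with client-go.
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Placeholder kubeconfig path; substitute whatever credentials apply on this host.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	node, err := client.CoreV1().Nodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// The kubelet keeps setting Ready=False with reason KubeletNotReady while no CNI config exists.
	for _, c := range node.Status.Conditions {
		if c.Type == corev1.NodeReady {
			fmt.Printf("Ready=%s reason=%s message=%q\n", c.Status, c.Reason, c.Message)
		}
	}
}
```

Until a CNI configuration shows up under /etc/kubernetes/cni/net.d/, this prints Ready=False with the same KubeletNotReady message recorded above.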
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.320064 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.320121 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.320130 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.320148 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.320161 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:31Z","lastTransitionTime":"2025-10-03T08:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.422220 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.422270 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.422282 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.422303 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.422318 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:31Z","lastTransitionTime":"2025-10-03T08:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.524731 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.524768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.524777 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.524791 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.524802 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:31Z","lastTransitionTime":"2025-10-03T08:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.627278 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.627333 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.627342 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.627357 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.627368 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:31Z","lastTransitionTime":"2025-10-03T08:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.728979 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.729022 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.729033 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.729049 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.729060 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:31Z","lastTransitionTime":"2025-10-03T08:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.831628 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.831698 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.831711 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.831731 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.831747 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:31Z","lastTransitionTime":"2025-10-03T08:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.934088 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.934162 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.934177 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.934209 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:31 crc kubenswrapper[4765]: I1003 08:40:31.934227 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:31Z","lastTransitionTime":"2025-10-03T08:40:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.038194 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.038251 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.038262 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.038287 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.038300 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:32Z","lastTransitionTime":"2025-10-03T08:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.140842 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.140935 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.140954 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.140982 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.140999 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:32Z","lastTransitionTime":"2025-10-03T08:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.243904 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.243967 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.243977 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.244003 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.244018 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:32Z","lastTransitionTime":"2025-10-03T08:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.306158 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:32 crc kubenswrapper[4765]: E1003 08:40:32.306376 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
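Each of the "Error syncing pod" failures above points at the same root cause: there is no CNI configuration file under /etc/kubernetes/cni/net.d/, so sandboxes for network-metrics-daemon-wdwf5, network-check-target-xd92c, network-check-source-55646444c4-trplf and networking-console-plugin-85b44fc459-gdk6g cannot be created. A minimal sketch, run on the node itself, that lists the usual CNI config file types in that directory (the .conf/.conflist/.json filter is standard CNI convention, not something stated in this log):

```go
// Minimal sketch: check whether any CNI configuration files exist in the directory
// named in the kubelet error message.
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path taken from the kubelet error message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		return
	}
	found := 0
	for _, e := range entries {
		// CNI plugins conventionally drop *.conf, *.conflist or *.json files here.
		ext := strings.ToLower(filepath.Ext(e.Name()))
		if ext == ".conf" || ext == ".conflist" || ext == ".json" {
			fmt.Println("CNI config candidate:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files found; the network plugin has not written its config yet")
	}
}
```

An empty result matches the NetworkPluginNotReady condition the kubelet keeps reporting here.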
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.347683 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.347741 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.347751 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.347769 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.347782 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:32Z","lastTransitionTime":"2025-10-03T08:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.450040 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.450100 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.450114 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.450134 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.450152 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:32Z","lastTransitionTime":"2025-10-03T08:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.552568 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.552696 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.552713 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.552738 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.552751 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:32Z","lastTransitionTime":"2025-10-03T08:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.655398 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.655428 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.655438 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.655451 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.655463 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:32Z","lastTransitionTime":"2025-10-03T08:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.758108 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.758162 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.758172 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.758189 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.758200 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:32Z","lastTransitionTime":"2025-10-03T08:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.861228 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.862093 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.862153 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.862182 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.862200 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:32Z","lastTransitionTime":"2025-10-03T08:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.965200 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.965250 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.965259 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.965276 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:32 crc kubenswrapper[4765]: I1003 08:40:32.965285 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:32Z","lastTransitionTime":"2025-10-03T08:40:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.068003 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.068056 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.068068 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.068089 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.068103 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.170787 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.170856 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.170872 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.170893 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.170903 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.273262 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.273309 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.273322 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.273342 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.273354 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.306247 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.306362 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:33 crc kubenswrapper[4765]: E1003 08:40:33.306481 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.306502 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:33 crc kubenswrapper[4765]: E1003 08:40:33.306606 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:33 crc kubenswrapper[4765]: E1003 08:40:33.306691 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.358946 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.359006 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.359020 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.359042 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.359054 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: E1003 08:40:33.371762 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:33Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.376931 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.376974 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
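The status patch itself is being rejected by the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743/node: its serving certificate expired on 2025-08-24T17:21:41Z, well before the current time 2025-10-03T08:40:33Z. A minimal sketch for reading that certificate's validity window directly from the endpoint, assuming 127.0.0.1:9743 is reachable from the node; verification is skipped deliberately so the expired chain can still be inspected rather than trusted:

```go
// Minimal sketch: dump the validity window of the webhook's serving certificate.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // endpoint from the "failed calling webhook" error above
	// Skip verification on purpose: the goal is to read the (expired) certificate, not trust it.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Printf("cannot connect to %s: %v\n", addr, err)
		return
	}
	defer conn.Close()

	now := time.Now()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%s notBefore=%s notAfter=%s expired=%t\n",
			cert.Subject,
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			now.After(cert.NotAfter))
	}
}
```

The kubelet notes it "will retry" and immediately attempts the same status patch again below.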
event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.376989 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.377010 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.377022 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: E1003 08:40:33.391123 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:33Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.394711 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.394762 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.394771 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.394786 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.394839 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: E1003 08:40:33.428895 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:33Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.431757 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.431802 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.431832 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.431851 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.431862 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: E1003 08:40:33.445118 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:33Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.448335 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.448376 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.448389 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.448406 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.448420 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: E1003 08:40:33.461432 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:33Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:33 crc kubenswrapper[4765]: E1003 08:40:33.461542 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.463157 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.463186 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.463212 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.463226 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.463236 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.565515 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.565550 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.565579 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.565595 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.565603 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.668941 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.668979 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.668990 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.669006 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.669019 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.771714 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.771762 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.771781 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.771799 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.771811 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.874586 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.874713 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.874739 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.874772 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.874797 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.977625 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.977695 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.977706 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.977737 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:33 crc kubenswrapper[4765]: I1003 08:40:33.977751 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:33Z","lastTransitionTime":"2025-10-03T08:40:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.085161 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.085197 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.085206 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.085220 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.085230 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:34Z","lastTransitionTime":"2025-10-03T08:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.187581 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.187647 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.187657 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.187692 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.187706 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:34Z","lastTransitionTime":"2025-10-03T08:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.290893 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.290935 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.290945 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.290961 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.290971 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:34Z","lastTransitionTime":"2025-10-03T08:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.306034 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:34 crc kubenswrapper[4765]: E1003 08:40:34.306234 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.393321 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.393363 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.393371 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.393388 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.393398 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:34Z","lastTransitionTime":"2025-10-03T08:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.496322 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.496615 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.496729 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.496828 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.496932 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:34Z","lastTransitionTime":"2025-10-03T08:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.599233 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.599275 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.599287 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.599302 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.599313 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:34Z","lastTransitionTime":"2025-10-03T08:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.701849 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.701891 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.701903 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.701920 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.701933 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:34Z","lastTransitionTime":"2025-10-03T08:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.804090 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.804155 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.804165 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.804187 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.804198 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:34Z","lastTransitionTime":"2025-10-03T08:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.907578 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.907615 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.907627 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.907645 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:34 crc kubenswrapper[4765]: I1003 08:40:34.907658 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:34Z","lastTransitionTime":"2025-10-03T08:40:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.010259 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.010328 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.010343 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.010365 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.010380 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:35Z","lastTransitionTime":"2025-10-03T08:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.113068 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.113169 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.113183 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.113204 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.113218 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:35Z","lastTransitionTime":"2025-10-03T08:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.215577 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.215630 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.215664 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.215685 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.215705 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:35Z","lastTransitionTime":"2025-10-03T08:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.305913 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.305932 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.306016 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:35 crc kubenswrapper[4765]: E1003 08:40:35.306206 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:35 crc kubenswrapper[4765]: E1003 08:40:35.306410 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:35 crc kubenswrapper[4765]: E1003 08:40:35.306500 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.307201 4765 scope.go:117] "RemoveContainer" containerID="a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.318913 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.318979 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.318995 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.319019 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.319033 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:35Z","lastTransitionTime":"2025-10-03T08:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.421129 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.421478 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.421492 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.421510 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.421523 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:35Z","lastTransitionTime":"2025-10-03T08:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.524282 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.524337 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.524348 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.524367 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.524379 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:35Z","lastTransitionTime":"2025-10-03T08:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.626854 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.626903 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.626913 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.626934 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.626947 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:35Z","lastTransitionTime":"2025-10-03T08:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.729332 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.729368 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.729379 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.729396 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.729408 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:35Z","lastTransitionTime":"2025-10-03T08:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.730850 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/2.log" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.733768 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.734332 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.757348 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.774987 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.788087 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.810661 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:03Z\\\",\\\"message\\\":\\\":false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133396 6432 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133455 6432 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1003 08:40:03.133502 6432 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.823424 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.832215 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.832265 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.832278 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.832299 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.832312 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:35Z","lastTransitionTime":"2025-10-03T08:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.837920 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.852984 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.871041 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.883503 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.900539 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.915241 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.929034 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"648d26ad-0ca3-4ce7-885d-6aab568ed72d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dee8eb78cfc7f681a7009b32e7521490cfa896aee35f8f552a150738224517be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.934507 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.934552 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.934565 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.934584 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.934595 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:35Z","lastTransitionTime":"2025-10-03T08:40:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.945423 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.960372 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a9b9fb7-e509-45bf-8ceb-fed6c0d26821\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78b8f31e2b3f0891e3909baeb57c5a2dfe52c0e85d1aa86fe045ed54c56d5202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cedef6c592c877edfd8afe1dc09789fdc84a816a6a84d9ac9115fa494d8b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e92137f7438e3f6ae4b9225226f23f10f0e5e8a2b6a86f486971315d8bee00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4765]: I1003 08:40:35.989095 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.012768 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.026142 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.036667 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.036710 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.036722 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.036741 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.036753 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:36Z","lastTransitionTime":"2025-10-03T08:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.042120 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:23Z\\\",\\\"message\\\":\\\"2025-10-03T08:39:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf\\\\n2025-10-03T08:39:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf to /host/opt/cni/bin/\\\\n2025-10-03T08:39:38Z [verbose] multus-daemon started\\\\n2025-10-03T08:39:38Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:40:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.056030 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.139252 4765 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.139313 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.139327 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.139353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.139369 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:36Z","lastTransitionTime":"2025-10-03T08:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.241416 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.241464 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.241477 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.241495 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.241508 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:36Z","lastTransitionTime":"2025-10-03T08:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.306413 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:36 crc kubenswrapper[4765]: E1003 08:40:36.306569 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.320987 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2612
08c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.333055 4765 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.344576 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.344665 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.344680 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.344702 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.344713 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:36Z","lastTransitionTime":"2025-10-03T08:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.346515 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.360980 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.375050 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.388354 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information 
is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 
08:40:36.400750 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a9b9fb7-e509-45bf-8ceb-fed6c0d26821\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78b8f31e2b3f0891e3909baeb57c5a2dfe52c0e85d1aa86fe045ed54c56d5202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cedef6c592c877edfd8afe1dc09789fdc84a816a6a84d9ac9115fa494d8b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e92137f7438e3f6ae4b9225226f23f10f0e5e8a2b6a86f486971315d8bee00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.416483 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.433340 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.446880 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.446916 4765 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.446926 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.446941 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.446953 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:36Z","lastTransitionTime":"2025-10-03T08:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.447484 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.461421 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"648d26ad-0ca3-4ce7-885d-6aab568ed72d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dee8eb78cfc7f681a7009b32e7521490cfa896aee35f8f552a150738224517be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.474748 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.488215 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:23Z\\\",\\\"message\\\":\\\"2025-10-03T08:39:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf\\\\n2025-10-03T08:39:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf to /host/opt/cni/bin/\\\\n2025-10-03T08:39:38Z [verbose] multus-daemon started\\\\n2025-10-03T08:39:38Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:40:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.501777 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.523977 4765 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.538372 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.549950 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.550039 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.550060 4765 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.550081 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.550096 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:36Z","lastTransitionTime":"2025-10-03T08:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.551709 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.570188 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:03Z\\\",\\\"message\\\":\\\":false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133396 6432 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133455 6432 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1003 08:40:03.133502 6432 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.583872 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.653056 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.653100 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.653110 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.653127 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.653139 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:36Z","lastTransitionTime":"2025-10-03T08:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.739537 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/3.log" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.740405 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/2.log" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.743802 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb" exitCode=1 Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.743856 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb"} Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.743899 4765 scope.go:117] "RemoveContainer" containerID="a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.744819 4765 scope.go:117] "RemoveContainer" containerID="115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb" Oct 03 08:40:36 crc kubenswrapper[4765]: E1003 08:40:36.744991 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\"" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.755613 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.755662 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.755706 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.755724 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.755734 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:36Z","lastTransitionTime":"2025-10-03T08:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.761925 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.777713 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:23Z\\\",\\\"message\\\":\\\"2025-10-03T08:39:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf\\\\n2025-10-03T08:39:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf to /host/opt/cni/bin/\\\\n2025-10-03T08:39:38Z [verbose] multus-daemon started\\\\n2025-10-03T08:39:38Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:40:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.790983 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.814365 4765 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.828536 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.841649 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.858825 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.858877 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.858890 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.858908 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.858920 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:36Z","lastTransitionTime":"2025-10-03T08:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.862887 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4d481217db9abe6da65a66219fdf2298353f237df78c085f40bb803f7349ccd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:03Z\\\",\\\"message\\\":\\\":false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.188:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {53c717ca-2174-4315-bb03-c937a9c0d9b6}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133396 6432 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1003 08:40:03.133455 6432 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1003 08:40:03.133502 6432 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\" error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z\\\\nI1003 08:40:36.201436 6807 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-wdwf5 before timer (time: 2025-10-03 08:40:37.635855498 +0000 UTC m=+2.038952178): skip\\\\nI1003 08:40:36.201453 6807 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 55.481µs)\\\\nI1003 08:40:36.201382 6807 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 08:40:36.201544 6807 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:36.201600 6807 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:36.201626 6807 factory.go:656] Stopping watch factory\\\\nI1003 08:40:36.201639 6807 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:36.201671 6807 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:36.201691 6807 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:36.201791 6807 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.876798 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.890870 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.903442 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.915260 4765 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.926840 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.941440 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.955430 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information 
is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 
08:40:36.961452 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.961495 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.961509 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.961529 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.961542 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:36Z","lastTransitionTime":"2025-10-03T08:40:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.969429 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a9b9fb7-e509-45bf-8ceb-fed6c0d26821\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78b8f31e2b3f0891e3909baeb57c5a2dfe52c0e85d1aa86fe045ed54c56d5202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cedef6c592c877edfd8afe1dc09789fdc84a816a6a84d9ac9115fa494d8b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e92137f7438e3f6ae4b9225226f23f10f0e5e8a2b6a86f486971315d8bee00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4765]: I1003 08:40:36.987513 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.003441 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.015139 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc 
kubenswrapper[4765]: I1003 08:40:37.025771 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"648d26ad-0ca3-4ce7-885d-6aab568ed72d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dee8eb78cfc7f681a7009b32e7521490cfa896aee35f8f552a150738224517be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.063909 4765 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.063987 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.063997 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.064014 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.064024 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:37Z","lastTransitionTime":"2025-10-03T08:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.166730 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.166768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.166779 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.166796 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.166809 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:37Z","lastTransitionTime":"2025-10-03T08:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.269727 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.269819 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.270018 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.270044 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.270060 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:37Z","lastTransitionTime":"2025-10-03T08:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.306100 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.306141 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.306100 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:37 crc kubenswrapper[4765]: E1003 08:40:37.306319 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:37 crc kubenswrapper[4765]: E1003 08:40:37.306509 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:37 crc kubenswrapper[4765]: E1003 08:40:37.306769 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.373265 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.373311 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.373325 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.373355 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.373367 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:37Z","lastTransitionTime":"2025-10-03T08:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.475423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.475471 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.475481 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.475505 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.475524 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:37Z","lastTransitionTime":"2025-10-03T08:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.578811 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.578850 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.578862 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.578877 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.578887 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:37Z","lastTransitionTime":"2025-10-03T08:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.681640 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.681706 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.681715 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.681733 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.681744 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:37Z","lastTransitionTime":"2025-10-03T08:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.748640 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/3.log" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.752095 4765 scope.go:117] "RemoveContainer" containerID="115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb" Oct 03 08:40:37 crc kubenswrapper[4765]: E1003 08:40:37.752375 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\"" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.768495 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a9b9fb7-e509-45bf-8ceb-fed6c0d26821\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78b8f31e2b3f0891e3909baeb57c5a2dfe52c0e85d1aa86fe045ed54c56d5202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cedef6c592c877edfd8afe1dc09789fdc84a816a6a84d9ac9115fa494d8b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e92137f7438e3f6ae4b9225226f23f10f0e5e8a2b6a86f486971315d8bee00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.780296 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.783815 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.783995 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.784069 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.784150 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.784224 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:37Z","lastTransitionTime":"2025-10-03T08:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.792134 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.803657 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.814965 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"648d26ad-0ca3-4ce7-885d-6aab568ed72d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dee8eb78cfc7f681a7009b32e7521490cfa896aee35f8f552a150738224517be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.827993 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.838603 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:23Z\\\",\\\"message\\\":\\\"2025-10-03T08:39:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf\\\\n2025-10-03T08:39:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf to /host/opt/cni/bin/\\\\n2025-10-03T08:39:38Z [verbose] multus-daemon started\\\\n2025-10-03T08:39:38Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:40:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.848298 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.868059 4765 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.879777 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.886482 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.886525 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.886536 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.886552 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.886563 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:37Z","lastTransitionTime":"2025-10-03T08:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.891843 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.911885 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\" error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z\\\\nI1003 08:40:36.201436 6807 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-wdwf5 before timer (time: 2025-10-03 08:40:37.635855498 +0000 UTC m=+2.038952178): skip\\\\nI1003 08:40:36.201453 6807 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 55.481µs)\\\\nI1003 08:40:36.201382 6807 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 08:40:36.201544 6807 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:36.201600 6807 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:36.201626 6807 factory.go:656] Stopping watch factory\\\\nI1003 08:40:36.201639 6807 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:36.201671 6807 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:36.201691 6807 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:36.201791 6807 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.925302 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.939575 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.952003 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.962802 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.975017 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.988089 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.988862 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.988894 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.988904 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.988920 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:37 crc kubenswrapper[4765]: I1003 08:40:37.988932 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:37Z","lastTransitionTime":"2025-10-03T08:40:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.003179 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.090930 4765 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.090980 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.090999 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.091019 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.091030 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:38Z","lastTransitionTime":"2025-10-03T08:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.193911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.193944 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.193954 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.193968 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.193979 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:38Z","lastTransitionTime":"2025-10-03T08:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.296241 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.296297 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.296310 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.296328 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.296337 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:38Z","lastTransitionTime":"2025-10-03T08:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.306075 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:38 crc kubenswrapper[4765]: E1003 08:40:38.306202 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.399199 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.399244 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.399257 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.399275 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.399286 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:38Z","lastTransitionTime":"2025-10-03T08:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.501511 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.501576 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.501587 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.501605 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.501619 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:38Z","lastTransitionTime":"2025-10-03T08:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.603614 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.603677 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.603695 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.603712 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.603724 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:38Z","lastTransitionTime":"2025-10-03T08:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.707639 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.707706 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.707718 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.707735 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.707745 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:38Z","lastTransitionTime":"2025-10-03T08:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.810203 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.810264 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.810282 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.810304 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.810316 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:38Z","lastTransitionTime":"2025-10-03T08:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.913218 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.913261 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.913274 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.913292 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:38 crc kubenswrapper[4765]: I1003 08:40:38.913305 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:38Z","lastTransitionTime":"2025-10-03T08:40:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.016541 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.016594 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.016605 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.016677 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.016699 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:39Z","lastTransitionTime":"2025-10-03T08:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.119841 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.119888 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.119898 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.119913 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.119925 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:39Z","lastTransitionTime":"2025-10-03T08:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.146964 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.147186 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.147165141 +0000 UTC m=+147.448659471 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.223334 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.223383 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.223395 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.223413 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.223425 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:39Z","lastTransitionTime":"2025-10-03T08:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.248033 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.248100 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.248127 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.248166 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248284 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248286 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248302 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248300 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248405 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.248383622 +0000 UTC m=+147.549877942 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248316 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248467 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248318 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248515 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.248500024 +0000 UTC m=+147.549994544 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248718 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248725 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.24871164 +0000 UTC m=+147.550206190 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.248763 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-03 08:41:43.248755221 +0000 UTC m=+147.550249751 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.306412 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.306440 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.306456 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.306572 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.306914 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:39 crc kubenswrapper[4765]: E1003 08:40:39.307108 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.326937 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.327006 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.327016 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.327033 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.327043 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:39Z","lastTransitionTime":"2025-10-03T08:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.429728 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.429777 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.429786 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.429805 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.429828 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:39Z","lastTransitionTime":"2025-10-03T08:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.532758 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.532911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.532926 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.532947 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.532962 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:39Z","lastTransitionTime":"2025-10-03T08:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.635948 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.636038 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.636048 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.636069 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.636080 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:39Z","lastTransitionTime":"2025-10-03T08:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.739917 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.739964 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.739977 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.739997 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.740010 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:39Z","lastTransitionTime":"2025-10-03T08:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.843375 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.843435 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.843447 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.843468 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.843481 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:39Z","lastTransitionTime":"2025-10-03T08:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.947562 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.947679 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.947692 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.947711 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:39 crc kubenswrapper[4765]: I1003 08:40:39.947724 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:39Z","lastTransitionTime":"2025-10-03T08:40:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.051166 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.051702 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.051712 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.051731 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.051744 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.154584 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.154629 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.154639 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.154674 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.154688 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.257416 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.257736 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.257748 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.257762 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.257771 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.306942 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:40 crc kubenswrapper[4765]: E1003 08:40:40.307103 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.360765 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.360804 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.360817 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.360833 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.360844 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.464340 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.464394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.464406 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.464423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.464434 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.567633 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.567703 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.567715 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.567735 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.567748 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.669764 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.669840 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.669851 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.669868 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.669882 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.772699 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.772775 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.772794 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.772823 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.772850 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.877069 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.877122 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.877131 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.877147 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.877159 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.980837 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.980907 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.980917 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.980934 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4765]: I1003 08:40:40.980945 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.084052 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.084156 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.084182 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.084215 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.084241 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.187954 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.188013 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.188025 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.188046 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.188061 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.291369 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.291437 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.291465 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.291498 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.291520 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.306305 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.306382 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.306450 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:41 crc kubenswrapper[4765]: E1003 08:40:41.306461 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:41 crc kubenswrapper[4765]: E1003 08:40:41.306621 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:41 crc kubenswrapper[4765]: E1003 08:40:41.306768 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.394353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.394389 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.394399 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.394413 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.394422 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.497526 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.497571 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.497586 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.497604 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.497615 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.600200 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.600248 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.600262 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.600279 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.600290 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.702417 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.702464 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.702474 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.702491 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.702502 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.807464 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.807793 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.807835 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.807854 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.807867 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.909509 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.909546 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.909555 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.909619 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4765]: I1003 08:40:41.909631 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.012872 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.012911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.012922 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.012936 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.012945 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.116039 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.116093 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.116107 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.116128 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.116140 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.218332 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.218396 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.218409 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.218428 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.218440 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.306693 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:42 crc kubenswrapper[4765]: E1003 08:40:42.306931 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.321595 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.321684 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.321703 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.321725 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.321746 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.425706 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.425779 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.425793 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.425813 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.425826 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.528787 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.528848 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.528858 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.528875 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.528890 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.631944 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.632008 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.632021 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.632039 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.632052 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.735749 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.735788 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.735799 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.735814 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.735826 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.838791 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.838832 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.838851 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.838868 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.838878 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.942480 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.942540 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.942556 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.942576 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4765]: I1003 08:40:42.942589 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.045590 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.045702 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.045713 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.045733 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.045752 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.149495 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.149572 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.149585 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.149607 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.149623 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.253015 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.253078 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.253097 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.253121 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.253138 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.306363 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.306363 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.306392 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:43 crc kubenswrapper[4765]: E1003 08:40:43.307191 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:43 crc kubenswrapper[4765]: E1003 08:40:43.307683 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:43 crc kubenswrapper[4765]: E1003 08:40:43.307745 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.355531 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.355574 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.355583 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.355599 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.355611 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.458399 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.458451 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.458466 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.458492 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.458511 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.532431 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.532525 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.532538 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.532558 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.532571 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: E1003 08:40:43.546999 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.552058 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.552171 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.552188 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.552209 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.552225 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: E1003 08:40:43.565689 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.569994 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.570041 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.570054 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.570074 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.570086 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: E1003 08:40:43.583295 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.590421 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.590483 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.590493 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.590512 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.590547 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: E1003 08:40:43.603545 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.607945 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.608058 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.608071 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.608094 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.608109 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: E1003 08:40:43.623767 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4765]: E1003 08:40:43.623956 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.626447 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.626518 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.626531 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.626557 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.626570 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.730175 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.730227 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.730236 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.730252 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.730264 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.833600 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.833690 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.833702 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.833723 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.833736 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.936951 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.936990 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.936999 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.937016 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4765]: I1003 08:40:43.937026 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.039542 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.039591 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.039603 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.039617 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.039637 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.141680 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.141720 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.141731 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.141746 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.141758 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.244475 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.244529 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.244689 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.244709 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.244721 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.306385 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:44 crc kubenswrapper[4765]: E1003 08:40:44.306525 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.347756 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.347874 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.347886 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.347906 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.347918 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.451365 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.451439 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.451459 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.451487 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.451508 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.554414 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.554454 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.554465 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.554481 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.554494 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.657693 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.657754 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.657767 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.657787 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.657801 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.761313 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.761404 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.761423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.761492 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.761505 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.864674 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.864726 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.864738 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.864759 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.864770 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.967887 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.967943 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.967954 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.967974 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4765]: I1003 08:40:44.967998 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.070474 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.070533 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.070545 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.070561 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.070572 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.173567 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.173611 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.173657 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.173679 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.173691 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.276863 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.276939 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.276951 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.276974 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.276987 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.306799 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.306897 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.306843 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:45 crc kubenswrapper[4765]: E1003 08:40:45.306999 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:45 crc kubenswrapper[4765]: E1003 08:40:45.307159 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:45 crc kubenswrapper[4765]: E1003 08:40:45.307206 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.380320 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.380393 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.380404 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.380427 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.380439 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.483159 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.483201 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.483209 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.483223 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.483233 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.586770 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.586843 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.586882 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.586921 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.586945 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.691211 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.691270 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.691291 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.691319 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.691337 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.795401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.795504 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.795527 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.795569 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.795597 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.899166 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.899217 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.899230 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.899249 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4765]: I1003 08:40:45.899259 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.002480 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.002544 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.002557 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.002573 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.002584 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.104621 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.104697 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.104711 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.104728 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.104740 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.208375 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.208439 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.208456 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.208479 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.208496 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.306325 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:46 crc kubenswrapper[4765]: E1003 08:40:46.306561 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.311119 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.311176 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.311189 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.311210 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.311228 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.324555 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.340616 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.355558 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.375040 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\" error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z\\\\nI1003 08:40:36.201436 6807 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-wdwf5 before timer (time: 2025-10-03 08:40:37.635855498 +0000 UTC m=+2.038952178): skip\\\\nI1003 08:40:36.201453 6807 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 55.481µs)\\\\nI1003 08:40:36.201382 6807 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 08:40:36.201544 6807 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:36.201600 6807 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:36.201626 6807 factory.go:656] Stopping watch factory\\\\nI1003 08:40:36.201639 6807 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:36.201671 6807 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:36.201691 6807 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:36.201791 6807 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.390191 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.408352 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.413557 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.413610 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc 
kubenswrapper[4765]: I1003 08:40:46.413622 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.413667 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.413682 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.421663 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.434938 4765 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.450533 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.465104 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.479844 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"648d26ad-0ca3-4ce7-885d-6aab568ed72d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dee8eb78cfc7f681a7009b32e7521490cfa896aee35f8f552a150738224517be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.497686 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.511483 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a9b9fb7-e509-45bf-8ceb-fed6c0d26821\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78b8f31e2b3f0891e3909baeb57c5a2dfe52c0e85d1aa86fe045ed54c56d5202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cedef6c592c877edfd8afe1dc09789fdc84a816a6a84d9ac9115fa494d8b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e92137f7438e3f6ae4b9225226f23f10f0e5e8a2b6a86f486971315d8bee00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.516874 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.516939 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.516951 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.516973 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.516989 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.527257 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.544521 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.566328 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed1
23b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.582021 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.597027 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:23Z\\\",\\\"message\\\":\\\"2025-10-03T08:39:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf\\\\n2025-10-03T08:39:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf to /host/opt/cni/bin/\\\\n2025-10-03T08:39:38Z [verbose] multus-daemon started\\\\n2025-10-03T08:39:38Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:40:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.610798 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:46Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.624380 4765 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.624416 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.624428 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.624444 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.624454 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.727095 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.727144 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.727155 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.727171 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.727182 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.830106 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.830145 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.830155 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.830169 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.830179 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.932417 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.932469 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.932482 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.932506 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4765]: I1003 08:40:46.932519 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.034812 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.034871 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.034884 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.034905 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.034920 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.137268 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.137328 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.137342 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.137363 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.137375 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.240685 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.240768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.240782 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.240809 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.240823 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.306343 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:47 crc kubenswrapper[4765]: E1003 08:40:47.306497 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.306554 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:47 crc kubenswrapper[4765]: E1003 08:40:47.306810 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.306942 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:47 crc kubenswrapper[4765]: E1003 08:40:47.307018 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
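
The repeated NodeNotReady / KubeletNotReady entries and the "Error syncing pod" messages trace back to a second, independent problem: there is still no CNI configuration file in /etc/kubernetes/cni/net.d/, and the earlier kube-multus restart shows it timed out waiting for the OVN-Kubernetes readiness indicator at /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. A small node-side sketch (paths taken from the log; treating /host/run/multus as the container view of the host's /run/multus is an assumption) that checks whether those files have appeared yet:

import os

CNI_DIRS = [
    "/etc/kubernetes/cni/net.d",   # directory the kubelet says is empty
    "/run/multus/cni/net.d",       # assumed host path behind /host/run/multus/cni/net.d
]
READINESS_FILE = "/run/multus/cni/net.d/10-ovn-kubernetes.conf"  # multus readiness indicator

for d in CNI_DIRS:
    try:
        entries = sorted(os.listdir(d))
    except FileNotFoundError:
        print(f"{d}: directory does not exist")
        continue
    confs = [e for e in entries if e.endswith((".conf", ".conflist", ".json"))]
    print(f"{d}: {confs if confs else 'no CNI config files yet'}")

print(READINESS_FILE, "present" if os.path.exists(READINESS_FILE) else "missing")

Until OVN-Kubernetes writes that configuration, the kubelet keeps the node's Ready condition at False and cannot create sandboxes for the pods listed below, which is exactly the pattern the rest of this log repeats.
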
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.344064 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.344117 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.344129 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.344148 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.344159 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.446351 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.446396 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.446408 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.446468 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.446480 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.548925 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.548967 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.548975 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.548990 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.548998 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.651706 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.651758 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.651769 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.651785 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.651796 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.753843 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.753883 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.753893 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.753908 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.753918 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.855728 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.855768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.855778 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.855792 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.855802 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.958260 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.958299 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.958310 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.958325 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4765]: I1003 08:40:47.958334 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.059915 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.059941 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.059949 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.059964 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.059972 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.162744 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.162805 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.162818 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.162835 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.162845 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.265978 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.266041 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.266057 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.266082 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.266096 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.306725 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:48 crc kubenswrapper[4765]: E1003 08:40:48.306946 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
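
Because the same "Node became not ready" condition is re-recorded every hundred milliseconds or so, it can be easier to summarize these entries than to read them one by one. A convenience sketch for doing that offline (the exported journal filename and the assumption of one journal entry per line are illustrative, not taken from this log):

import json
import re
from collections import Counter

# Pull the condition={...} JSON out of each "Node became not ready" line and
# count distinct (reason, message) pairs so repeated entries collapse to one row.
CONDITION_RE = re.compile(r'"Node became not ready".*?condition=(\{.*\})')

def summarize(lines):
    counts = Counter()
    for line in lines:
        m = CONDITION_RE.search(line)
        if m:
            cond = json.loads(m.group(1))
            counts[(cond.get("reason"), cond.get("message"))] += 1
    return counts

if __name__ == "__main__":
    # "kubelet-journal.log" is a hypothetical journal export, one entry per line.
    with open("kubelet-journal.log", encoding="utf-8") as f:
        for (reason, message), n in summarize(f).items():
            print(f"{n:5d}x {reason}: {message}")
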
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.368901 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.369249 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.369349 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.369427 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.369514 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.472807 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.473149 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.473260 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.473359 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.473448 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.576288 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.576324 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.576332 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.576350 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.576359 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.678605 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.678674 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.678684 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.678701 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.678711 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.781438 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.781638 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.781668 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.781686 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.781698 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.884257 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.884300 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.884314 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.884331 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.884345 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.986395 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.986476 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.986487 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.986502 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4765]: I1003 08:40:48.986514 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.089097 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.089153 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.089216 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.089235 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.089247 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.192634 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.192752 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.192766 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.192790 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.192813 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.296558 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.296619 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.296631 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.296672 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.296687 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.306027 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.306027 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:49 crc kubenswrapper[4765]: E1003 08:40:49.306213 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.306052 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:49 crc kubenswrapper[4765]: E1003 08:40:49.306338 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:49 crc kubenswrapper[4765]: E1003 08:40:49.306395 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.400757 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.400807 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.400816 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.400836 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.400846 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.504113 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.504775 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.504797 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.504813 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.504825 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.607058 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.607110 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.607121 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.607136 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.607146 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.709925 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.710206 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.710330 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.710433 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.710514 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.813319 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.813596 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.813749 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.813848 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.813928 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.916876 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.916922 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.916931 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.916945 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4765]: I1003 08:40:49.916955 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.019497 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.019549 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.019561 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.019575 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.019584 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.121885 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.121919 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.121928 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.121941 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.121952 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.224293 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.224348 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.224357 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.224371 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.224379 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.306254 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:50 crc kubenswrapper[4765]: E1003 08:40:50.306879 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.326949 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.327013 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.327031 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.327061 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.327080 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.430335 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.430387 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.430402 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.430422 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.430438 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.532854 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.532921 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.532932 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.532948 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.532964 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.635385 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.635442 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.635455 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.635474 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.635487 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.737535 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.737577 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.737588 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.737602 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.737612 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.840902 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.841512 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.841750 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.842009 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.842224 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.944945 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.944988 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.945001 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.945020 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4765]: I1003 08:40:50.945034 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.048295 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.048670 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.048737 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.048862 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.048933 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.151233 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.151691 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.151963 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.152092 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.152169 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.255573 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.255879 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.256007 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.256112 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.256212 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.306305 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.306389 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.306308 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:51 crc kubenswrapper[4765]: E1003 08:40:51.306495 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:51 crc kubenswrapper[4765]: E1003 08:40:51.306620 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:51 crc kubenswrapper[4765]: E1003 08:40:51.306837 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.307537 4765 scope.go:117] "RemoveContainer" containerID="115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb" Oct 03 08:40:51 crc kubenswrapper[4765]: E1003 08:40:51.307795 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\"" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.358395 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.358631 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.358720 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.358825 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.358887 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.466238 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.466294 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.466330 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.466349 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.466359 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.568730 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.568772 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.568784 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.568801 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.568813 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.671009 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.671055 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.671066 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.671083 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.671095 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.772896 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.772943 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.772954 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.772970 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.773012 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.875322 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.875359 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.875369 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.875384 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.875394 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.977881 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.977932 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.977945 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.977960 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4765]: I1003 08:40:51.977971 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.080419 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.080447 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.080456 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.080468 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.080476 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.182486 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.182532 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.182544 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.182561 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.182571 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.285382 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.285435 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.285445 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.285462 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.285472 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.306234 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:52 crc kubenswrapper[4765]: E1003 08:40:52.306431 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.387607 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.387676 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.387687 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.387708 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.387722 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.491949 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.491987 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.491997 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.492011 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.492025 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.594681 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.594734 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.594746 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.594763 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.594774 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.697989 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.698045 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.698059 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.698084 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.698100 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.801969 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.802040 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.802052 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.802073 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.802086 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.904573 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.904658 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.904668 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.904682 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4765]: I1003 08:40:52.904691 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.007009 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.007055 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.007064 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.007078 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.007087 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.109103 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.109151 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.109165 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.109182 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.109212 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.211448 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.211483 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.211493 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.211510 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.211520 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.306467 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.306576 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.306467 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:53 crc kubenswrapper[4765]: E1003 08:40:53.306817 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:53 crc kubenswrapper[4765]: E1003 08:40:53.306847 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:53 crc kubenswrapper[4765]: E1003 08:40:53.306594 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.313819 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.313846 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.313858 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.313869 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.313877 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.416520 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.416557 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.416568 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.416583 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.416593 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.519017 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.519053 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.519065 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.519085 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.519095 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.621307 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.621373 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.621387 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.621402 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.621415 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.723524 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.723577 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.723587 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.723601 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.723613 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.772384 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.772424 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.772436 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.772466 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.772478 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: E1003 08:40:53.784010 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:53Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.787253 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.787315 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.787326 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.787339 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.787348 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: E1003 08:40:53.799282 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:53Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.802353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.802385 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.802397 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.802412 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.802423 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: E1003 08:40:53.815376 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:53Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.818519 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.818550 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.818560 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.818573 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.818582 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: E1003 08:40:53.829574 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:53Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.832286 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.832349 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.832364 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.832379 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.832388 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: E1003 08:40:53.842929 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4a5a1b91-d1b3-462d-b8c2-89eae83d6c3d\\\",\\\"systemUUID\\\":\\\"c85bcae8-d463-4f60-8737-09c0f3c02573\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:53Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:53 crc kubenswrapper[4765]: E1003 08:40:53.843043 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.844317 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
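Every status-patch attempt above fails for the same reason: the admission webhook "node.network-node-identity.openshift.io" at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, so the API server rejects each patch until the kubelet gives up with "update node status exceeds retry count". The sketch below is one way to confirm the expiry from the node itself; it is a minimal illustration, not part of the log, and it assumes the third-party cryptography package (>= 42) is installed. It deliberately skips verification because the goal is only to read the certificate's validity window, not to trust it.

```python
# check_webhook_cert.py - print the validity window of the webhook's serving certificate.
# Host/port are taken from the URL in the kubelet log (https://127.0.0.1:9743);
# the third-party "cryptography" package (>= 42 for the *_utc attributes) is assumed.
import socket
import ssl
from datetime import datetime, timezone

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743

ctx = ssl.create_default_context()
ctx.check_hostname = False        # we only want to read the certificate, not validate it
ctx.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        der = tls.getpeercert(binary_form=True)   # DER-encoded peer certificate

cert = x509.load_der_x509_certificate(der)
now = datetime.now(timezone.utc)
print("subject:   ", cert.subject.rfc4514_string())
print("not before:", cert.not_valid_before_utc)
print("not after: ", cert.not_valid_after_utc)
print("expired:   ", now > cert.not_valid_after_utc)
```

On an OpenShift Local/CRC instance this pattern is typically a stale-cluster problem rather than a webhook bug; once the cluster's internal certificates are renewed (or the instance is recreated), the same patch payload should be admitted.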
event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.844362 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.844374 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.844391 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.844404 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.946421 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.946476 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.946485 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.946501 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4765]: I1003 08:40:53.946512 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.049042 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.049086 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.049094 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.049108 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.049118 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.151581 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.151620 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.151628 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.151671 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.151691 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.254275 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.254328 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.254337 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.254350 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.254359 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.306124 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:54 crc kubenswrapper[4765]: E1003 08:40:54.306276 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.356762 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.356813 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.356843 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.356857 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.356866 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.406804 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:54 crc kubenswrapper[4765]: E1003 08:40:54.406929 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:54 crc kubenswrapper[4765]: E1003 08:40:54.406978 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs podName:6824483c-e9a7-4e95-bb3d-e00bac2af3aa nodeName:}" failed. No retries permitted until 2025-10-03 08:41:58.406964591 +0000 UTC m=+162.708458921 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs") pod "network-metrics-daemon-wdwf5" (UID: "6824483c-e9a7-4e95-bb3d-e00bac2af3aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.458359 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.458398 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.458410 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.458425 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.458435 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.561189 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.561296 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.561333 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.561369 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.561390 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.664099 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.664175 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.664188 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.664208 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.664222 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.767098 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.767168 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.767187 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.767210 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.767228 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.869276 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.869337 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.869350 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.869373 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.869385 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.971383 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.971440 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.971450 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.971466 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4765]: I1003 08:40:54.971481 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.073278 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.073344 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.073353 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.073367 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.073376 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.175864 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.175922 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.175935 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.175949 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.175962 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.277877 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.277919 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.277928 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.277941 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.277951 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.306818 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.306874 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.306827 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:55 crc kubenswrapper[4765]: E1003 08:40:55.306934 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:55 crc kubenswrapper[4765]: E1003 08:40:55.306995 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:55 crc kubenswrapper[4765]: E1003 08:40:55.307043 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.381495 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.381567 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.381585 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.381613 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.381633 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.484473 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.484532 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.484544 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.484561 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.484574 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.587557 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.587612 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.587632 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.587668 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.587678 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.690924 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.690981 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.690994 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.691012 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.691023 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.794391 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.794465 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.794485 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.794515 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.794545 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.897091 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.897142 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.897153 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.897168 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.897180 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.999604 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.999737 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.999750 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.999779 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4765]: I1003 08:40:55.999802 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.102139 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.102182 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.102193 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.102209 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.102221 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.204282 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.204326 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.204334 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.204348 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.204362 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.305993 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:56 crc kubenswrapper[4765]: E1003 08:40:56.306182 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.307547 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.307677 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.307691 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.307710 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.307724 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.321523 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p9gf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46c76a49-e10b-4a12-a6c7-12c330cd3c4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d127171dd11041892813dd0596574630e756cc4f2e54b149619bffdbe9bae37d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2ldt7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p9gf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.333375 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-svqbq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cdf1d7-9997-4015-bdbf-eedacc081685\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43441b23076aa88505c0014c6734ffd0302f9011300711eece573befc94f3fbf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tm2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-svqbq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.345414 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fcbd8c60-e4bc-43c1-b769-9ae58a05ea0f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb36c0727cbf11d911102b2e91c3989a264374191f4ff34349ed6ec8eba2e58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d810b33fb4971c7a1473884cbe04ad15b3cac6c0ca9af2384819d72a748ab173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ggxtg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9pssq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.360205 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9660b983-3561-4cf7-8ea0-31a63e8d1051\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23c27e7d79dab0c54b22f0114e7f55a9267e3a21961b8479c37fd77d0e8b66c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb89a31c804d86cbc11b04e4dcfab79d4536f28a107d43e98d48172a1c257ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3168f51c49cd9633557cf31cdc0fec47b3fcf981462dc85f4253a0584fcf52\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ae775d5cfd2e88a1c7ca516e1c59f2e08ce1d383653cacbefeac66b07abcb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.377811 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4f105c06-3e67-486f-a622-923ae442117c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7a29ab4db9b7548c70824520272e6323f615934cddf1d92bf653f6d8f030a19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab314ed006e71cae099ecf8be4ac18fb777dd83fe4e233d9bc347965f76b6a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://261208c132a527b6786f64f37dd699e889f68ebc01265028288b3620615274bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8358f38c6c98de8206fc03fda21200b0e526972747588a6e32d525c064aa57ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9af7a0993c4e8d1177050ee170ae306c2e2570b0daca2d3f5c812b5f0e9c81da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23ac91bc25ecc5c606b22bf6df52129330bb8c214ef8ec881fb202df6350c853\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c836df75da45ef369baafc15bdbed1068becc3bf57a4c83a8519280ff3eb847\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6lhz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4bmrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.389822 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a9b9fb7-e509-45bf-8ceb-fed6c0d26821\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78b8f31e2b3f0891e3909baeb57c5a2dfe52c0e85d1aa86fe045ed54c56d5202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cedef6c592c877edfd8afe1dc09789fdc84a816a6a84d9ac9115fa494d8b5fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e92137f7438e3f6ae4b9225226f23f10f0e5e8a2b6a86f486971315d8bee00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00a75616be0bff2d1c730afda7f4212c6d85e07870e6f680c6903862387e00a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.405138 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.410619 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.410731 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.410752 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.410778 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.410818 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.420380 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2003e4dd90b26bd915c05a690d0ab12b21ef7773138f11993382b0e7ac2d778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.433053 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9t858\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wdwf5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.445089 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"648d26ad-0ca3-4ce7-885d-6aab568ed72d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dee8eb78cfc7f681a7009b32e7521490cfa896aee35f8f552a150738224517be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdfadb3541e9c76e5ab7469b7161c24715f4eeff89ec4bba0cc253bece41f1b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.461788 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c434639-9c6c-420c-a51b-fdf59b654daa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d31497fd54f7500ac776bdd9a16414d873c053353911ed5ba237b201e9e7ac12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89b19d6a5b90a2051665bf2e5e150f73df7899eff246ee75246bc2127c415ea\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fad446c147481b1a0ff2a173848b2d24384e6b6aafcd0749dc820e9abfe929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2e21a2b21d807288e991a3a44ea38d316985590080aa4291aa3385816f826dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aa0283dadc2c5e48aa9bfd20ef35d889a350244b72eb8529d4d4e682d5fa0e47\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 08:39:29.830291 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:39:29.833185 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2710500186/tls.crt::/tmp/serving-cert-2710500186/tls.key\\\\\\\"\\\\nI1003 08:39:35.213224 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:39:35.219008 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:39:35.219055 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:39:35.219088 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:39:35.219098 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:39:35.227302 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:39:35.227314 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:39:35.227372 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227381 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:39:35.227385 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:39:35.227395 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:39:35.227398 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:39:35.227401 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:39:35.229781 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa1d1c0f4dab4b4c6c9f3afccac34473eab40a714015a2a7ce725ed1a92b609c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c94b0f96f50324a67a7b189e9d3c4e406193421aa2e793692219bef9f2578dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.475707 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-csb5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"912755c8-dd28-4fbc-82de-9cf85df54f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:23Z\\\",\\\"message\\\":\\\"2025-10-03T08:39:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf\\\\n2025-10-03T08:39:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5149c5cc-1f13-4c92-ba76-7ef1ed5a7abf to /host/opt/cni/bin/\\\\n2025-10-03T08:39:38Z [verbose] multus-daemon started\\\\n2025-10-03T08:39:38Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:40:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8k2k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-csb5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.486591 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d636dbad-9ffa-4ba7-953f-adea04b76a23\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33c95fa1034cd2135f4293956d73825e809195d220ff0b10a6604bd399a5730a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhzzj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-j8mss\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.507071 4765 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859ee4f1-636f-48e5-ad72-fef19f311c64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf0cbc60fa84230a87aff908b5b2a76956abfa937aeea94363abe91640b93e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fee410f71d4fa82e7bf54dad906736bc7182be512825a06bf7a4c76ed2f2789\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab0ed26066c771f9943b6435fa382ff61fb04f0c8bef3d505aba4c5d1a1d4740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://153c9584928c3d064c6098126dad58733015ed123b9a55c959e69ddcc0ad2110\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fa1bc45d80d90bc08ca3a7177e2ac77b66c36f5a0f863532174be7719bfaae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1359775b3b907743ce1d7754c904fab5cf953ad3a28f91e781be95462e6a9d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a56c24b3af6b3d5c18009cbb5517273674eb36f57157344f99903f9c50c7d1a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:18Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18dbd66a12f96fa4beef4da9dcc0e1b1d3a5164a8b21a6774142289078d6c05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.512999 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.513036 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.513051 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.513065 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.513077 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.519726 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.531409 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a37f2b5f797755065158a077232872befbc61f2f19c80dfd27bba7f131db794c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.548195 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea01fba1-445f-46c1-898c-1ceb34866850\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://115444bc9990e2060fb9e8fff1ca7328f3abbaee
25879c6af5feac46f0a417bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\" error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z\\\\nI1003 08:40:36.201436 6807 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-wdwf5 before timer (time: 2025-10-03 08:40:37.635855498 +0000 UTC m=+2.038952178): skip\\\\nI1003 08:40:36.201453 6807 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 55.481µs)\\\\nI1003 08:40:36.201382 6807 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 08:40:36.201544 6807 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:36.201600 6807 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:36.201626 6807 factory.go:656] Stopping watch factory\\\\nI1003 08:40:36.201639 6807 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:36.201671 6807 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:36.201691 6807 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:36.201791 6807 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:39:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:39:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m4zqv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:39:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-srgbb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.559527 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.569903 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:39:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8d6f534a0a702832db2f8947c1528a98d511d3950cc5a6ec0ac3b31b3dbcb7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20ad16cb9f0f7e17ac946cd2c3f7c01b6e6c95d6d76c99f482b3761546689af2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:39:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:56Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.615689 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.615728 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.615740 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.615755 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.615767 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.718069 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.718105 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.718114 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.718129 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.718139 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.819515 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.819545 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.819553 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.819565 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.819576 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.925625 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.925748 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.925765 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.925784 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4765]: I1003 08:40:56.925801 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.027827 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.027894 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.027911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.027924 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.027936 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.130267 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.130301 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.130311 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.130325 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.130336 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.232909 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.232949 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.232960 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.232975 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.232986 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.305896 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:57 crc kubenswrapper[4765]: E1003 08:40:57.306308 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.306006 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:57 crc kubenswrapper[4765]: E1003 08:40:57.306502 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.305896 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:57 crc kubenswrapper[4765]: E1003 08:40:57.306761 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.335392 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.335447 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.335463 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.335480 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.335492 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.438758 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.439021 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.439052 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.439072 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.439085 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.541131 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.541179 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.541190 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.541206 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.541219 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.643950 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.643991 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.644003 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.644017 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.644028 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.746277 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.746311 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.746321 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.746335 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.746344 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.848036 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.848090 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.848100 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.848121 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.848133 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.950334 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.950383 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.950394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.950413 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4765]: I1003 08:40:57.950424 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.052589 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.052662 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.052676 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.052697 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.052709 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.155301 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.155340 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.155350 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.155363 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.155372 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.257965 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.258000 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.258009 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.258023 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.258031 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.306554 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:40:58 crc kubenswrapper[4765]: E1003 08:40:58.306793 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.360393 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.360427 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.360438 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.360454 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.360463 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.462710 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.462747 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.462757 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.462769 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.462778 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.564937 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.564982 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.564990 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.565003 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.565012 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.667036 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.667072 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.667081 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.667095 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.667107 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.769421 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.769452 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.769463 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.769476 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.769486 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.871446 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.871478 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.871486 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.871500 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.871510 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.974263 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.974301 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.974312 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.974326 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4765]: I1003 08:40:58.974335 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.076142 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.076189 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.076198 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.076211 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.076225 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.178911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.178951 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.178960 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.178974 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.178984 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.281088 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.281325 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.281337 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.281355 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.281368 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.306428 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.306487 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.306446 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:59 crc kubenswrapper[4765]: E1003 08:40:59.306556 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:59 crc kubenswrapper[4765]: E1003 08:40:59.306657 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:59 crc kubenswrapper[4765]: E1003 08:40:59.306751 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.384120 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.384161 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.384170 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.384184 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.384193 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.487285 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.487346 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.487355 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.487369 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.487378 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.589602 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.589714 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.589725 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.589753 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.589767 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.691874 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.691932 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.691942 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.691959 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.691971 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.794220 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.794612 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.794745 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.794939 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.795034 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.897907 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.897959 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.897968 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.897989 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4765]: I1003 08:40:59.898010 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.000567 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.000608 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.000620 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.000635 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.000674 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.103637 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.103709 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.103719 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.103740 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.103752 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.206601 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.206721 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.206740 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.206761 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.206777 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.306151 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:00 crc kubenswrapper[4765]: E1003 08:41:00.306609 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.308922 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.308955 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.308968 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.308984 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.308994 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.411403 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.411682 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.411750 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.411819 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.411877 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.514607 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.514641 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.514667 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.514679 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.514689 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.617298 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.617337 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.617347 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.617361 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.617371 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.719166 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.719712 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.719805 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.719922 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.720010 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.823293 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.823333 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.823345 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.823369 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.823385 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.925915 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.925967 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.925978 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.925994 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4765]: I1003 08:41:00.926004 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.028018 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.028055 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.028064 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.028078 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.028088 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.130678 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.130719 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.130730 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.130745 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.130757 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.233444 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.233495 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.233510 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.233529 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.233539 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.306327 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.306360 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.306396 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:01 crc kubenswrapper[4765]: E1003 08:41:01.306454 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:01 crc kubenswrapper[4765]: E1003 08:41:01.306550 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:01 crc kubenswrapper[4765]: E1003 08:41:01.306665 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.336277 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.336315 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.336325 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.336340 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.336350 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.438849 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.438889 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.438897 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.438911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.438922 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.542506 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.542562 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.542576 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.542593 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.542605 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.645728 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.645780 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.645791 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.645808 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.645820 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.748115 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.748163 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.748173 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.748372 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.748385 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.850435 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.850476 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.850486 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.850501 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.850512 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.952944 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.952977 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.952985 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.952997 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4765]: I1003 08:41:01.953008 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.055401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.055442 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.055454 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.055470 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.055481 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.158401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.158441 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.158451 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.158466 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.158477 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.261053 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.261112 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.261124 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.261141 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.261153 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.306484 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:02 crc kubenswrapper[4765]: E1003 08:41:02.306625 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.364179 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.364245 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.364262 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.364286 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.364307 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.466140 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.466176 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.466189 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.466204 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.466213 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.568764 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.568801 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.568811 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.568826 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.568835 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.671265 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.671372 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.671381 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.671394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.671404 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.773759 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.773798 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.773806 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.773819 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.773830 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.875452 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.875484 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.875492 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.875507 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.875517 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.977381 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.977407 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.977415 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.977427 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4765]: I1003 08:41:02.977436 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.080336 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.080379 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.080389 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.080408 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.080420 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.182631 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.182700 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.182715 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.182730 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.182739 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.284924 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.284962 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.284971 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.284991 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.285008 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.306571 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.306679 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:03 crc kubenswrapper[4765]: E1003 08:41:03.306715 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:03 crc kubenswrapper[4765]: E1003 08:41:03.306804 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.307018 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:03 crc kubenswrapper[4765]: E1003 08:41:03.307078 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.388554 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.388742 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.388776 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.388817 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.388848 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.491636 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.491731 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.491746 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.491767 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.491779 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.595305 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.595360 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.595391 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.595414 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.595438 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.698845 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.698943 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.698968 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.699006 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.699030 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.801577 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.801888 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.801902 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.801917 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.801927 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.905611 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.905768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.905792 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.905827 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4765]: I1003 08:41:03.905847 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.010749 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.010831 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.010855 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.010886 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.010909 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:04Z","lastTransitionTime":"2025-10-03T08:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.074831 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.074923 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.074937 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.074961 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.074976 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:04Z","lastTransitionTime":"2025-10-03T08:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.122465 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp"] Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.122967 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.125254 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.125392 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.125424 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.126462 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.150687 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=87.150633131 podStartE2EDuration="1m27.150633131s" podCreationTimestamp="2025-10-03 08:39:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.149952854 +0000 UTC m=+108.451447194" watchObservedRunningTime="2025-10-03 08:41:04.150633131 +0000 UTC m=+108.452127461" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.195687 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-csb5z" podStartSLOduration=89.195637431 podStartE2EDuration="1m29.195637431s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.182350311 +0000 UTC m=+108.483844641" watchObservedRunningTime="2025-10-03 08:41:04.195637431 +0000 UTC m=+108.497131761" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 
08:41:04.196392 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podStartSLOduration=89.19638419 podStartE2EDuration="1m29.19638419s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.195870017 +0000 UTC m=+108.497364337" watchObservedRunningTime="2025-10-03 08:41:04.19638419 +0000 UTC m=+108.497878520" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.221466 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d4231d16-f975-4b42-b499-c4c7eb679858-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.221533 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4231d16-f975-4b42-b499-c4c7eb679858-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.221569 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d4231d16-f975-4b42-b499-c4c7eb679858-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.221588 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4231d16-f975-4b42-b499-c4c7eb679858-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.221636 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d4231d16-f975-4b42-b499-c4c7eb679858-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.306142 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:04 crc kubenswrapper[4765]: E1003 08:41:04.306385 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.323695 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4231d16-f975-4b42-b499-c4c7eb679858-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.323755 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d4231d16-f975-4b42-b499-c4c7eb679858-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.323773 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4231d16-f975-4b42-b499-c4c7eb679858-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.323806 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d4231d16-f975-4b42-b499-c4c7eb679858-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.323853 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d4231d16-f975-4b42-b499-c4c7eb679858-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.323862 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d4231d16-f975-4b42-b499-c4c7eb679858-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.324034 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d4231d16-f975-4b42-b499-c4c7eb679858-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.324812 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d4231d16-f975-4b42-b499-c4c7eb679858-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc 
kubenswrapper[4765]: I1003 08:41:04.336410 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4231d16-f975-4b42-b499-c4c7eb679858-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.343911 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=89.343877123 podStartE2EDuration="1m29.343877123s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.343200747 +0000 UTC m=+108.644695077" watchObservedRunningTime="2025-10-03 08:41:04.343877123 +0000 UTC m=+108.645371453" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.349391 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4231d16-f975-4b42-b499-c4c7eb679858-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vr8pp\" (UID: \"d4231d16-f975-4b42-b499-c4c7eb679858\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.371590 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4bmrv" podStartSLOduration=88.371566333 podStartE2EDuration="1m28.371566333s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.37142589 +0000 UTC m=+108.672920220" watchObservedRunningTime="2025-10-03 08:41:04.371566333 +0000 UTC m=+108.673060663" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.387383 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p9gf5" podStartSLOduration=89.387361337 podStartE2EDuration="1m29.387361337s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.387199502 +0000 UTC m=+108.688693832" watchObservedRunningTime="2025-10-03 08:41:04.387361337 +0000 UTC m=+108.688855667" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.397839 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-svqbq" podStartSLOduration=89.397800096 podStartE2EDuration="1m29.397800096s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.39712229 +0000 UTC m=+108.698616620" watchObservedRunningTime="2025-10-03 08:41:04.397800096 +0000 UTC m=+108.699294426" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.426268 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9pssq" podStartSLOduration=88.426249165 podStartE2EDuration="1m28.426249165s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.411523478 +0000 UTC m=+108.713017808" watchObservedRunningTime="2025-10-03 08:41:04.426249165 +0000 UTC m=+108.727743495" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.437958 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.447125 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.447095734 podStartE2EDuration="1m29.447095734s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.446429088 +0000 UTC m=+108.747923428" watchObservedRunningTime="2025-10-03 08:41:04.447095734 +0000 UTC m=+108.748590064" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.447431 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=42.447424692 podStartE2EDuration="42.447424692s" podCreationTimestamp="2025-10-03 08:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.426390279 +0000 UTC m=+108.727884609" watchObservedRunningTime="2025-10-03 08:41:04.447424692 +0000 UTC m=+108.748919022" Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.461661 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=52.461622346 podStartE2EDuration="52.461622346s" podCreationTimestamp="2025-10-03 08:40:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.460307533 +0000 UTC m=+108.761801883" watchObservedRunningTime="2025-10-03 08:41:04.461622346 +0000 UTC m=+108.763116676" Oct 03 08:41:04 crc kubenswrapper[4765]: W1003 08:41:04.463630 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4231d16_f975_4b42_b499_c4c7eb679858.slice/crio-25a4eb9cead2ec329b39b4229646a34169b5c736e961795d9b757f3704eae4d8 WatchSource:0}: Error finding container 25a4eb9cead2ec329b39b4229646a34169b5c736e961795d9b757f3704eae4d8: Status 404 returned error can't find the container with id 25a4eb9cead2ec329b39b4229646a34169b5c736e961795d9b757f3704eae4d8 Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.847777 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" event={"ID":"d4231d16-f975-4b42-b499-c4c7eb679858","Type":"ContainerStarted","Data":"71e8512df5999b2c8738337645f534caaefceb42ff2bbe4128edb8a98c0d584c"} Oct 03 08:41:04 crc kubenswrapper[4765]: I1003 08:41:04.847849 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" event={"ID":"d4231d16-f975-4b42-b499-c4c7eb679858","Type":"ContainerStarted","Data":"25a4eb9cead2ec329b39b4229646a34169b5c736e961795d9b757f3704eae4d8"} Oct 03 08:41:05 crc kubenswrapper[4765]: I1003 08:41:05.306321 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:05 crc kubenswrapper[4765]: E1003 08:41:05.306452 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:05 crc kubenswrapper[4765]: I1003 08:41:05.306630 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:05 crc kubenswrapper[4765]: I1003 08:41:05.306723 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:05 crc kubenswrapper[4765]: E1003 08:41:05.306842 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:05 crc kubenswrapper[4765]: E1003 08:41:05.307069 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:06 crc kubenswrapper[4765]: I1003 08:41:06.305876 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:06 crc kubenswrapper[4765]: E1003 08:41:06.306805 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:06 crc kubenswrapper[4765]: I1003 08:41:06.307576 4765 scope.go:117] "RemoveContainer" containerID="115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb" Oct 03 08:41:06 crc kubenswrapper[4765]: E1003 08:41:06.307800 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-srgbb_openshift-ovn-kubernetes(ea01fba1-445f-46c1-898c-1ceb34866850)\"" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" Oct 03 08:41:07 crc kubenswrapper[4765]: I1003 08:41:07.306540 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:07 crc kubenswrapper[4765]: I1003 08:41:07.306589 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:07 crc kubenswrapper[4765]: I1003 08:41:07.306589 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:07 crc kubenswrapper[4765]: E1003 08:41:07.306740 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:07 crc kubenswrapper[4765]: E1003 08:41:07.307159 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:07 crc kubenswrapper[4765]: E1003 08:41:07.307316 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:08 crc kubenswrapper[4765]: I1003 08:41:08.306427 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:08 crc kubenswrapper[4765]: E1003 08:41:08.306890 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:09 crc kubenswrapper[4765]: I1003 08:41:09.306092 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:09 crc kubenswrapper[4765]: I1003 08:41:09.306119 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:09 crc kubenswrapper[4765]: I1003 08:41:09.306151 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:09 crc kubenswrapper[4765]: E1003 08:41:09.306215 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:09 crc kubenswrapper[4765]: E1003 08:41:09.306352 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:09 crc kubenswrapper[4765]: E1003 08:41:09.306430 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:10 crc kubenswrapper[4765]: I1003 08:41:10.305933 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:10 crc kubenswrapper[4765]: E1003 08:41:10.306175 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:10 crc kubenswrapper[4765]: I1003 08:41:10.865779 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-csb5z_912755c8-dd28-4fbc-82de-9cf85df54f4f/kube-multus/1.log" Oct 03 08:41:10 crc kubenswrapper[4765]: I1003 08:41:10.866458 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-csb5z_912755c8-dd28-4fbc-82de-9cf85df54f4f/kube-multus/0.log" Oct 03 08:41:10 crc kubenswrapper[4765]: I1003 08:41:10.866495 4765 generic.go:334] "Generic (PLEG): container finished" podID="912755c8-dd28-4fbc-82de-9cf85df54f4f" containerID="52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05" exitCode=1 Oct 03 08:41:10 crc kubenswrapper[4765]: I1003 08:41:10.866529 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-csb5z" event={"ID":"912755c8-dd28-4fbc-82de-9cf85df54f4f","Type":"ContainerDied","Data":"52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05"} Oct 03 08:41:10 crc kubenswrapper[4765]: I1003 08:41:10.866563 4765 scope.go:117] "RemoveContainer" containerID="d7f179012e9f55f30c641a1ae3640cc90cefb3d2527d0c1e0580c219899503e1" Oct 03 08:41:10 crc kubenswrapper[4765]: I1003 08:41:10.866969 4765 scope.go:117] "RemoveContainer" containerID="52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05" Oct 03 08:41:10 crc kubenswrapper[4765]: E1003 08:41:10.867118 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-csb5z_openshift-multus(912755c8-dd28-4fbc-82de-9cf85df54f4f)\"" pod="openshift-multus/multus-csb5z" podUID="912755c8-dd28-4fbc-82de-9cf85df54f4f" Oct 03 08:41:10 crc kubenswrapper[4765]: I1003 08:41:10.885915 4765 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vr8pp" podStartSLOduration=95.885883075 podStartE2EDuration="1m35.885883075s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:04.86048074 +0000 UTC m=+109.161975070" watchObservedRunningTime="2025-10-03 08:41:10.885883075 +0000 UTC m=+115.187377405" Oct 03 08:41:11 crc kubenswrapper[4765]: I1003 08:41:11.306140 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:11 crc kubenswrapper[4765]: I1003 08:41:11.306211 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:11 crc kubenswrapper[4765]: I1003 08:41:11.306270 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:11 crc kubenswrapper[4765]: E1003 08:41:11.306386 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:11 crc kubenswrapper[4765]: E1003 08:41:11.306508 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:11 crc kubenswrapper[4765]: E1003 08:41:11.306566 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:11 crc kubenswrapper[4765]: I1003 08:41:11.871089 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-csb5z_912755c8-dd28-4fbc-82de-9cf85df54f4f/kube-multus/1.log" Oct 03 08:41:12 crc kubenswrapper[4765]: I1003 08:41:12.306373 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:12 crc kubenswrapper[4765]: E1003 08:41:12.306859 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:13 crc kubenswrapper[4765]: I1003 08:41:13.305924 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:13 crc kubenswrapper[4765]: I1003 08:41:13.306040 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:13 crc kubenswrapper[4765]: I1003 08:41:13.305924 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:13 crc kubenswrapper[4765]: E1003 08:41:13.306153 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:13 crc kubenswrapper[4765]: E1003 08:41:13.306334 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:13 crc kubenswrapper[4765]: E1003 08:41:13.306359 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:14 crc kubenswrapper[4765]: I1003 08:41:14.305869 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:14 crc kubenswrapper[4765]: E1003 08:41:14.306222 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:15 crc kubenswrapper[4765]: I1003 08:41:15.306340 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:15 crc kubenswrapper[4765]: I1003 08:41:15.306443 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:15 crc kubenswrapper[4765]: E1003 08:41:15.306480 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:15 crc kubenswrapper[4765]: E1003 08:41:15.306561 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:15 crc kubenswrapper[4765]: I1003 08:41:15.306450 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:15 crc kubenswrapper[4765]: E1003 08:41:15.306665 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:16 crc kubenswrapper[4765]: E1003 08:41:16.271120 4765 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 03 08:41:16 crc kubenswrapper[4765]: I1003 08:41:16.306495 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:16 crc kubenswrapper[4765]: E1003 08:41:16.307683 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:16 crc kubenswrapper[4765]: E1003 08:41:16.417123 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 08:41:17 crc kubenswrapper[4765]: I1003 08:41:17.306425 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:17 crc kubenswrapper[4765]: I1003 08:41:17.306490 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:17 crc kubenswrapper[4765]: E1003 08:41:17.306584 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:17 crc kubenswrapper[4765]: I1003 08:41:17.306684 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:17 crc kubenswrapper[4765]: E1003 08:41:17.306683 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:17 crc kubenswrapper[4765]: E1003 08:41:17.306834 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:17 crc kubenswrapper[4765]: I1003 08:41:17.308013 4765 scope.go:117] "RemoveContainer" containerID="115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb" Oct 03 08:41:17 crc kubenswrapper[4765]: I1003 08:41:17.891314 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/3.log" Oct 03 08:41:17 crc kubenswrapper[4765]: I1003 08:41:17.893714 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerStarted","Data":"7f00afa4ccebb6c76784137043797c0ee3ab98e16e9dffb9acb0f972b0c35b63"} Oct 03 08:41:17 crc kubenswrapper[4765]: I1003 08:41:17.894127 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:41:17 crc kubenswrapper[4765]: I1003 08:41:17.919969 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podStartSLOduration=101.91995223 podStartE2EDuration="1m41.91995223s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:17.919319744 +0000 UTC m=+122.220814084" watchObservedRunningTime="2025-10-03 08:41:17.91995223 +0000 UTC m=+122.221446560" Oct 03 08:41:18 crc kubenswrapper[4765]: I1003 08:41:18.061071 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wdwf5"] Oct 03 08:41:18 crc kubenswrapper[4765]: I1003 08:41:18.061234 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:18 crc kubenswrapper[4765]: E1003 08:41:18.061382 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:19 crc kubenswrapper[4765]: I1003 08:41:19.306828 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:19 crc kubenswrapper[4765]: E1003 08:41:19.306991 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:19 crc kubenswrapper[4765]: I1003 08:41:19.307143 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:19 crc kubenswrapper[4765]: I1003 08:41:19.306865 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:19 crc kubenswrapper[4765]: E1003 08:41:19.307345 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:19 crc kubenswrapper[4765]: E1003 08:41:19.307436 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:20 crc kubenswrapper[4765]: I1003 08:41:20.306292 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:20 crc kubenswrapper[4765]: E1003 08:41:20.306431 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:21 crc kubenswrapper[4765]: I1003 08:41:21.306668 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:21 crc kubenswrapper[4765]: I1003 08:41:21.306689 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:21 crc kubenswrapper[4765]: E1003 08:41:21.306797 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:21 crc kubenswrapper[4765]: I1003 08:41:21.306825 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:21 crc kubenswrapper[4765]: E1003 08:41:21.306909 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:21 crc kubenswrapper[4765]: E1003 08:41:21.307058 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:21 crc kubenswrapper[4765]: E1003 08:41:21.418204 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 08:41:22 crc kubenswrapper[4765]: I1003 08:41:22.306526 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:22 crc kubenswrapper[4765]: E1003 08:41:22.306732 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:23 crc kubenswrapper[4765]: I1003 08:41:23.306113 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:23 crc kubenswrapper[4765]: I1003 08:41:23.306119 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:23 crc kubenswrapper[4765]: E1003 08:41:23.306351 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:23 crc kubenswrapper[4765]: I1003 08:41:23.306118 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:23 crc kubenswrapper[4765]: E1003 08:41:23.306407 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:23 crc kubenswrapper[4765]: E1003 08:41:23.306443 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:24 crc kubenswrapper[4765]: I1003 08:41:24.306083 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:24 crc kubenswrapper[4765]: E1003 08:41:24.306224 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:25 crc kubenswrapper[4765]: I1003 08:41:25.306222 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:25 crc kubenswrapper[4765]: I1003 08:41:25.306222 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:25 crc kubenswrapper[4765]: E1003 08:41:25.306338 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:25 crc kubenswrapper[4765]: I1003 08:41:25.306384 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:25 crc kubenswrapper[4765]: I1003 08:41:25.306475 4765 scope.go:117] "RemoveContainer" containerID="52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05" Oct 03 08:41:25 crc kubenswrapper[4765]: E1003 08:41:25.306892 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:25 crc kubenswrapper[4765]: E1003 08:41:25.307122 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:25 crc kubenswrapper[4765]: I1003 08:41:25.921568 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-csb5z_912755c8-dd28-4fbc-82de-9cf85df54f4f/kube-multus/1.log" Oct 03 08:41:25 crc kubenswrapper[4765]: I1003 08:41:25.921961 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-csb5z" event={"ID":"912755c8-dd28-4fbc-82de-9cf85df54f4f","Type":"ContainerStarted","Data":"8d3bc39e0926f0219495285f71e5ec98da034b168a3092f1121da2eabf6b6f10"} Oct 03 08:41:26 crc kubenswrapper[4765]: I1003 08:41:26.306198 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:26 crc kubenswrapper[4765]: E1003 08:41:26.307497 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:26 crc kubenswrapper[4765]: E1003 08:41:26.420578 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 08:41:27 crc kubenswrapper[4765]: I1003 08:41:27.306135 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:27 crc kubenswrapper[4765]: I1003 08:41:27.306240 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:27 crc kubenswrapper[4765]: I1003 08:41:27.306301 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:27 crc kubenswrapper[4765]: E1003 08:41:27.306517 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:27 crc kubenswrapper[4765]: E1003 08:41:27.306768 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:27 crc kubenswrapper[4765]: E1003 08:41:27.306863 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:28 crc kubenswrapper[4765]: I1003 08:41:28.306367 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:28 crc kubenswrapper[4765]: E1003 08:41:28.306577 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:29 crc kubenswrapper[4765]: I1003 08:41:29.306506 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:29 crc kubenswrapper[4765]: I1003 08:41:29.306610 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:29 crc kubenswrapper[4765]: E1003 08:41:29.306720 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:29 crc kubenswrapper[4765]: I1003 08:41:29.306615 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:29 crc kubenswrapper[4765]: E1003 08:41:29.306772 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:29 crc kubenswrapper[4765]: E1003 08:41:29.306862 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:30 crc kubenswrapper[4765]: I1003 08:41:30.305972 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:30 crc kubenswrapper[4765]: E1003 08:41:30.306504 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdwf5" podUID="6824483c-e9a7-4e95-bb3d-e00bac2af3aa" Oct 03 08:41:31 crc kubenswrapper[4765]: I1003 08:41:31.305946 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:31 crc kubenswrapper[4765]: I1003 08:41:31.306014 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:31 crc kubenswrapper[4765]: E1003 08:41:31.306124 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:31 crc kubenswrapper[4765]: I1003 08:41:31.306181 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:31 crc kubenswrapper[4765]: E1003 08:41:31.306297 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:31 crc kubenswrapper[4765]: E1003 08:41:31.306354 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:32 crc kubenswrapper[4765]: I1003 08:41:32.305724 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:32 crc kubenswrapper[4765]: I1003 08:41:32.307481 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 08:41:32 crc kubenswrapper[4765]: I1003 08:41:32.311068 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 08:41:33 crc kubenswrapper[4765]: I1003 08:41:33.305737 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:33 crc kubenswrapper[4765]: I1003 08:41:33.305777 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:33 crc kubenswrapper[4765]: I1003 08:41:33.305822 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:33 crc kubenswrapper[4765]: I1003 08:41:33.307136 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 08:41:33 crc kubenswrapper[4765]: I1003 08:41:33.307689 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 08:41:33 crc kubenswrapper[4765]: I1003 08:41:33.308016 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 08:41:33 crc kubenswrapper[4765]: I1003 08:41:33.308469 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.642737 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.676468 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.676916 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.677214 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.677562 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.679973 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.680372 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.680634 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.681016 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.681304 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jgs5w"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.681782 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.685133 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8mnt6"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.685837 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.693562 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qdr5x"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.693957 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.694323 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gcgfs"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.695282 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.695368 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.695472 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.695284 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.695677 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.695737 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.695830 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.695943 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.696049 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.696163 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.695042 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qdr5x" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.697026 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.697312 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.697666 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.698585 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.698796 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-g8jbc"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.698941 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.699258 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.702985 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4rknw"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.710041 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-f64ph"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.710784 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2gzl6"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.711121 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.711249 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.711779 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.711953 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.712074 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.712157 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.712266 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.712730 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.712840 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mpxvz"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.712970 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.712986 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.713070 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.713167 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.721283 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.721695 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.721747 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.721879 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.721960 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.722674 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.722864 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.723073 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.723225 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.723240 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.723434 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.723525 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.723930 4765 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.724184 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.724364 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.724422 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.724475 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.724481 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.724603 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.724666 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.724801 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.724819 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.724938 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.725108 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.725214 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.725305 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.725398 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.725453 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.725745 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.726567 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.767137 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.767678 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.767930 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.768314 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.768562 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.768930 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.769130 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.769552 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.770168 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.770663 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.770875 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.771070 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gzcf9"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.771470 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-oauth-serving-cert\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.771792 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.772226 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-service-ca\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.772370 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-oauth-config\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.772456 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5rgb\" (UniqueName: \"kubernetes.io/projected/d6e8ca49-1faf-4e22-8760-d7eca3820980-kube-api-access-f5rgb\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.772556 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-config\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.772724 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-serving-cert\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.772839 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-trusted-ca-bundle\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.775170 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.775725 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.776829 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.777025 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.777101 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.777203 4765 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.777324 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.784099 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.784847 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.785125 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.787756 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.796401 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.796623 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.796851 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.796977 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.797080 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.797182 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.797283 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.797416 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.797540 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.786024 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.786147 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.797833 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-djlns"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.786195 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 
08:41:34.786238 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.786277 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.786384 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.786460 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.786514 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.786557 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.786698 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.787187 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.787377 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.787512 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.787689 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.799124 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.799498 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.804606 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.805721 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.805870 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.805921 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.806075 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.806208 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.806276 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.806318 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.806442 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.806213 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.806920 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.815209 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pfj5p"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.815795 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.816674 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.823183 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.825894 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.836474 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.836814 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-swd9l"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.838182 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.838821 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.839118 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.861317 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.862048 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.862274 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.862501 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.862971 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.863106 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.865191 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.865764 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.866327 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.866632 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.867090 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.867638 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.868312 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.870168 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.872553 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qwb6x"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.872942 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.873282 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.874985 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.875327 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.876111 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.876561 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.877531 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.877653 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rmm5l"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.878556 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.880432 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.883738 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.886453 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.886748 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.887201 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.887361 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.887750 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.888192 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9cw"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.888576 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889295 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a071347-8c80-4f91-87f3-1d95c7b18a1c-metrics-certs\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889339 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889419 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28fmp\" (UniqueName: \"kubernetes.io/projected/909e5d8a-0d69-4973-b9ce-bc5febb55e14-kube-api-access-28fmp\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889442 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-etcd-serving-ca\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889474 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889512 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwks2\" (UniqueName: 
\"kubernetes.io/projected/0ea22c01-e088-40b8-aecd-e83fe862bc78-kube-api-access-qwks2\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889540 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959cebb7-4057-42d3-a1bf-fc19557247cc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ph5q8\" (UID: \"959cebb7-4057-42d3-a1bf-fc19557247cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889556 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889574 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889602 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8029aef2-a0bf-4d08-b786-0bfff6f8943a-apiservice-cert\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889628 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a071347-8c80-4f91-87f3-1d95c7b18a1c-service-ca-bundle\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889672 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c891c2-c5ff-4815-9f09-347204c5da1d-config\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889690 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17c891c2-c5ff-4815-9f09-347204c5da1d-images\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889713 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-trusted-ca-bundle\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 
08:41:34.889729 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c891c2-c5ff-4815-9f09-347204c5da1d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889748 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/959cebb7-4057-42d3-a1bf-fc19557247cc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ph5q8\" (UID: \"959cebb7-4057-42d3-a1bf-fc19557247cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889776 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-oauth-serving-cert\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889793 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eccb70cc-3c95-4f97-ad20-610bb8a7b5df-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t6dcd\" (UID: \"eccb70cc-3c95-4f97-ad20-610bb8a7b5df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889809 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889825 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qtnr\" (UniqueName: \"kubernetes.io/projected/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-kube-api-access-2qtnr\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889857 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-service-ca\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889874 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktdhc\" (UniqueName: \"kubernetes.io/projected/1a071347-8c80-4f91-87f3-1d95c7b18a1c-kube-api-access-ktdhc\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889896 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-etcd-client\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889924 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5rgb\" (UniqueName: \"kubernetes.io/projected/d6e8ca49-1faf-4e22-8760-d7eca3820980-kube-api-access-f5rgb\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889945 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz6sh\" (UniqueName: \"kubernetes.io/projected/4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170-kube-api-access-qz6sh\") pod \"openshift-config-operator-7777fb866f-9k7k4\" (UID: \"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889966 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6d5f9563-ba1f-4c05-a32d-127a5c01932d-node-pullsecrets\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.889989 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-config\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.890012 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmpcj\" (UniqueName: \"kubernetes.io/projected/6d5f9563-ba1f-4c05-a32d-127a5c01932d-kube-api-access-mmpcj\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.890048 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8029aef2-a0bf-4d08-b786-0bfff6f8943a-tmpfs\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.890076 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-etcd-service-ca\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.890094 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vzc4\" (UniqueName: 
\"kubernetes.io/projected/d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc-kube-api-access-7vzc4\") pod \"downloads-7954f5f757-qdr5x\" (UID: \"d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc\") " pod="openshift-console/downloads-7954f5f757-qdr5x" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.890113 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-image-import-ca\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.890134 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-serving-cert\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.890157 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.890255 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.890283 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/909e5d8a-0d69-4973-b9ce-bc5febb55e14-service-ca-bundle\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.890313 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zvc5\" (UniqueName: \"kubernetes.io/projected/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-kube-api-access-6zvc5\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.890335 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8029aef2-a0bf-4d08-b786-0bfff6f8943a-webhook-cert\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891499 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-dir\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 
08:41:34.891542 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891577 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-machine-approver-tls\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891604 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6d5f9563-ba1f-4c05-a32d-127a5c01932d-etcd-client\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891679 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891731 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv824\" (UniqueName: \"kubernetes.io/projected/8029aef2-a0bf-4d08-b786-0bfff6f8943a-kube-api-access-pv824\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891798 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-serving-cert\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891837 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891862 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-client-ca\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891884 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n86kd\" (UniqueName: \"kubernetes.io/projected/eccb70cc-3c95-4f97-ad20-610bb8a7b5df-kube-api-access-n86kd\") pod \"openshift-apiserver-operator-796bbdcf4f-t6dcd\" (UID: \"eccb70cc-3c95-4f97-ad20-610bb8a7b5df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891885 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-trusted-ca-bundle\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891943 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-client-ca\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.891968 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tr8j\" (UniqueName: \"kubernetes.io/projected/3b2c5fda-4f45-444f-991b-0afa96721739-kube-api-access-7tr8j\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892025 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170-serving-cert\") pod \"openshift-config-operator-7777fb866f-9k7k4\" (UID: \"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892053 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-config\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892076 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnc8f\" (UniqueName: \"kubernetes.io/projected/17c891c2-c5ff-4815-9f09-347204c5da1d-kube-api-access-dnc8f\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892122 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-auth-proxy-config\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892144 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892174 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892195 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909e5d8a-0d69-4973-b9ce-bc5febb55e14-config\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892211 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-config\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892236 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-etcd-ca\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892255 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-audit\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892293 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eccb70cc-3c95-4f97-ad20-610bb8a7b5df-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t6dcd\" (UID: \"eccb70cc-3c95-4f97-ad20-610bb8a7b5df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892316 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892522 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/6d5f9563-ba1f-4c05-a32d-127a5c01932d-audit-dir\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892558 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9k7k4\" (UID: \"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892589 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-oauth-config\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892619 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-serving-cert\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892635 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-oauth-serving-cert\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892688 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7psh\" (UniqueName: \"kubernetes.io/projected/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-kube-api-access-z7psh\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892760 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-policies\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892811 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2kqd\" (UniqueName: \"kubernetes.io/projected/959cebb7-4057-42d3-a1bf-fc19557247cc-kube-api-access-m2kqd\") pod \"kube-storage-version-migrator-operator-b67b599dd-ph5q8\" (UID: \"959cebb7-4057-42d3-a1bf-fc19557247cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892926 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-config\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892974 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1a071347-8c80-4f91-87f3-1d95c7b18a1c-stats-auth\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.892990 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-service-ca\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893008 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d5f9563-ba1f-4c05-a32d-127a5c01932d-serving-cert\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893046 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2c5fda-4f45-444f-991b-0afa96721739-serving-cert\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893097 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-config\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893135 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/909e5d8a-0d69-4973-b9ce-bc5febb55e14-serving-cert\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893329 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qdr5x"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893400 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1a071347-8c80-4f91-87f3-1d95c7b18a1c-default-certificate\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893427 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/6d5f9563-ba1f-4c05-a32d-127a5c01932d-encryption-config\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893444 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-config\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893506 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893531 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893554 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/909e5d8a-0d69-4973-b9ce-bc5febb55e14-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.893658 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-config\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.908048 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-serving-cert\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.908123 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2gzl6"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.909793 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.909838 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.912235 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/machine-api-operator-5694c8668f-8mnt6"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.913150 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-oauth-config\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.915770 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4rknw"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.916530 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-djlns"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.919843 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.919909 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.921872 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.922445 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jgs5w"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.926668 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.926804 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.926868 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gzcf9"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.928289 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.935387 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qj7dr"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.936112 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.936206 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qj7dr" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.944261 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.946571 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.953769 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pfj5p"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.955041 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.961814 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gcgfs"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.963779 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7s2rp"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.964165 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.966758 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7s2rp" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.967434 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.969218 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.971435 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.973268 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.974386 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.976099 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.979469 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.981839 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-swd9l"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.983440 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.984508 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.985609 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mpxvz"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.986805 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qj7dr"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.988467 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g8jbc"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.989143 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rmm5l"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.991924 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.991954 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qwb6x"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.992189 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5g7vw"] Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.993227 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994435 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994481 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8029aef2-a0bf-4d08-b786-0bfff6f8943a-apiservice-cert\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994508 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a071347-8c80-4f91-87f3-1d95c7b18a1c-service-ca-bundle\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994527 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c891c2-c5ff-4815-9f09-347204c5da1d-config\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994568 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17c891c2-c5ff-4815-9f09-347204c5da1d-images\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994586 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c891c2-c5ff-4815-9f09-347204c5da1d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994607 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/959cebb7-4057-42d3-a1bf-fc19557247cc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ph5q8\" (UID: \"959cebb7-4057-42d3-a1bf-fc19557247cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994626 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eccb70cc-3c95-4f97-ad20-610bb8a7b5df-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t6dcd\" (UID: \"eccb70cc-3c95-4f97-ad20-610bb8a7b5df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994660 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994680 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qtnr\" (UniqueName: \"kubernetes.io/projected/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-kube-api-access-2qtnr\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994706 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktdhc\" (UniqueName: \"kubernetes.io/projected/1a071347-8c80-4f91-87f3-1d95c7b18a1c-kube-api-access-ktdhc\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994724 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-etcd-client\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994742 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmpcj\" (UniqueName: \"kubernetes.io/projected/6d5f9563-ba1f-4c05-a32d-127a5c01932d-kube-api-access-mmpcj\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.994758 
4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8029aef2-a0bf-4d08-b786-0bfff6f8943a-tmpfs\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995050 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz6sh\" (UniqueName: \"kubernetes.io/projected/4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170-kube-api-access-qz6sh\") pod \"openshift-config-operator-7777fb866f-9k7k4\" (UID: \"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995102 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6d5f9563-ba1f-4c05-a32d-127a5c01932d-node-pullsecrets\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995123 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-config\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995145 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-etcd-service-ca\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995162 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vzc4\" (UniqueName: \"kubernetes.io/projected/d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc-kube-api-access-7vzc4\") pod \"downloads-7954f5f757-qdr5x\" (UID: \"d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc\") " pod="openshift-console/downloads-7954f5f757-qdr5x" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995180 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-image-import-ca\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995200 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-serving-cert\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995220 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: 
\"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995239 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/909e5d8a-0d69-4973-b9ce-bc5febb55e14-service-ca-bundle\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995260 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zvc5\" (UniqueName: \"kubernetes.io/projected/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-kube-api-access-6zvc5\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995281 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8029aef2-a0bf-4d08-b786-0bfff6f8943a-webhook-cert\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995313 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-dir\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995333 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995354 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995374 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-machine-approver-tls\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995393 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6d5f9563-ba1f-4c05-a32d-127a5c01932d-etcd-client\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995413 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv824\" (UniqueName: \"kubernetes.io/projected/8029aef2-a0bf-4d08-b786-0bfff6f8943a-kube-api-access-pv824\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995432 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n86kd\" (UniqueName: \"kubernetes.io/projected/eccb70cc-3c95-4f97-ad20-610bb8a7b5df-kube-api-access-n86kd\") pod \"openshift-apiserver-operator-796bbdcf4f-t6dcd\" (UID: \"eccb70cc-3c95-4f97-ad20-610bb8a7b5df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995450 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995467 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-client-ca\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995484 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-client-ca\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995503 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tr8j\" (UniqueName: \"kubernetes.io/projected/3b2c5fda-4f45-444f-991b-0afa96721739-kube-api-access-7tr8j\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995520 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnc8f\" (UniqueName: \"kubernetes.io/projected/17c891c2-c5ff-4815-9f09-347204c5da1d-kube-api-access-dnc8f\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995537 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170-serving-cert\") pod \"openshift-config-operator-7777fb866f-9k7k4\" (UID: \"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995553 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-config\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995570 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-auth-proxy-config\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995591 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995611 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995631 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909e5d8a-0d69-4973-b9ce-bc5febb55e14-config\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995665 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-config\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995683 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-etcd-ca\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995698 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-audit\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995715 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eccb70cc-3c95-4f97-ad20-610bb8a7b5df-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t6dcd\" (UID: \"eccb70cc-3c95-4f97-ad20-610bb8a7b5df\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995733 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995750 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6d5f9563-ba1f-4c05-a32d-127a5c01932d-audit-dir\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995767 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9k7k4\" (UID: \"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995786 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2kqd\" (UniqueName: \"kubernetes.io/projected/959cebb7-4057-42d3-a1bf-fc19557247cc-kube-api-access-m2kqd\") pod \"kube-storage-version-migrator-operator-b67b599dd-ph5q8\" (UID: \"959cebb7-4057-42d3-a1bf-fc19557247cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995804 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-serving-cert\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995821 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7psh\" (UniqueName: \"kubernetes.io/projected/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-kube-api-access-z7psh\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995837 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-policies\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995855 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d5f9563-ba1f-4c05-a32d-127a5c01932d-serving-cert\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995870 
4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2c5fda-4f45-444f-991b-0afa96721739-serving-cert\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995900 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-config\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995916 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1a071347-8c80-4f91-87f3-1d95c7b18a1c-stats-auth\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995938 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/909e5d8a-0d69-4973-b9ce-bc5febb55e14-serving-cert\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995955 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-config\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.995984 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1a071347-8c80-4f91-87f3-1d95c7b18a1c-default-certificate\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.996017 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6d5f9563-ba1f-4c05-a32d-127a5c01932d-encryption-config\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.996033 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.996052 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.996069 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/909e5d8a-0d69-4973-b9ce-bc5febb55e14-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.996088 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a071347-8c80-4f91-87f3-1d95c7b18a1c-metrics-certs\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.996103 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.996121 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28fmp\" (UniqueName: \"kubernetes.io/projected/909e5d8a-0d69-4973-b9ce-bc5febb55e14-kube-api-access-28fmp\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.996138 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-etcd-serving-ca\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.996164 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.996180 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959cebb7-4057-42d3-a1bf-fc19557247cc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ph5q8\" (UID: \"959cebb7-4057-42d3-a1bf-fc19557247cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.996199 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwks2\" (UniqueName: 
\"kubernetes.io/projected/0ea22c01-e088-40b8-aecd-e83fe862bc78-kube-api-access-qwks2\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.997010 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eccb70cc-3c95-4f97-ad20-610bb8a7b5df-config\") pod \"openshift-apiserver-operator-796bbdcf4f-t6dcd\" (UID: \"eccb70cc-3c95-4f97-ad20-610bb8a7b5df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.997627 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-auth-proxy-config\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.998084 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.998391 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6d5f9563-ba1f-4c05-a32d-127a5c01932d-audit-dir\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.999097 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c891c2-c5ff-4815-9f09-347204c5da1d-config\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:34 crc kubenswrapper[4765]: I1003 08:41:34.999194 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.000030 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a071347-8c80-4f91-87f3-1d95c7b18a1c-service-ca-bundle\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.000255 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/17c891c2-c5ff-4815-9f09-347204c5da1d-images\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 
08:41:35.001185 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/909e5d8a-0d69-4973-b9ce-bc5febb55e14-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.001245 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-config\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.001989 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170-available-featuregates\") pod \"openshift-config-operator-7777fb866f-9k7k4\" (UID: \"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.003107 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-etcd-service-ca\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.003730 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.003776 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a071347-8c80-4f91-87f3-1d95c7b18a1c-metrics-certs\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.003796 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c891c2-c5ff-4815-9f09-347204c5da1d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.003858 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.003875 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-dir\") pod 
\"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.004083 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.010332 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-machine-approver-tls\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.004828 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/909e5d8a-0d69-4973-b9ce-bc5febb55e14-config\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.004885 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.005152 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xjnnd"] Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.010601 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8029aef2-a0bf-4d08-b786-0bfff6f8943a-tmpfs\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.005599 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-config\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.005763 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1a071347-8c80-4f91-87f3-1d95c7b18a1c-default-certificate\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.005944 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-policies\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.010942 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-image-import-ca\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.006113 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.008798 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-audit\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.009377 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-etcd-client\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.010379 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1a071347-8c80-4f91-87f3-1d95c7b18a1c-stats-auth\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.011508 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.011757 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.012130 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-etcd-serving-ca\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.012408 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9cw"] Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.012440 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl"] Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.012455 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4"] Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.012468 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5g7vw"] Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.012481 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xjnnd"] Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.012577 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.012729 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2c5fda-4f45-444f-991b-0afa96721739-serving-cert\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.013077 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/909e5d8a-0d69-4973-b9ce-bc5febb55e14-service-ca-bundle\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.006097 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-etcd-ca\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.004441 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6d5f9563-ba1f-4c05-a32d-127a5c01932d-encryption-config\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.013593 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6d5f9563-ba1f-4c05-a32d-127a5c01932d-node-pullsecrets\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.013778 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d5f9563-ba1f-4c05-a32d-127a5c01932d-config\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.005329 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-serving-cert\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.014496 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.014951 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-config\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.014962 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-client-ca\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.015351 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.016063 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.016197 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/909e5d8a-0d69-4973-b9ce-bc5febb55e14-serving-cert\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.016263 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-serving-cert\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.016915 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-config\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.017160 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170-serving-cert\") pod \"openshift-config-operator-7777fb866f-9k7k4\" (UID: \"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.017372 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d5f9563-ba1f-4c05-a32d-127a5c01932d-serving-cert\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.017405 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-client-ca\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.017634 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.018186 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.018543 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6d5f9563-ba1f-4c05-a32d-127a5c01932d-etcd-client\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.019278 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eccb70cc-3c95-4f97-ad20-610bb8a7b5df-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-t6dcd\" (UID: \"eccb70cc-3c95-4f97-ad20-610bb8a7b5df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.019976 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.047029 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.059772 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.079566 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.102609 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 03 08:41:35 crc kubenswrapper[4765]: 
I1003 08:41:35.140237 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.162252 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.183391 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.200604 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.220717 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.240174 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.259886 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.280138 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.299437 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.320095 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.341146 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.348545 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8029aef2-a0bf-4d08-b786-0bfff6f8943a-webhook-cert\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.352826 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8029aef2-a0bf-4d08-b786-0bfff6f8943a-apiservice-cert\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.361448 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.380613 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.400839 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.420853 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" 
Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.440904 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.460328 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.480304 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.501963 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.521607 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.541372 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.560110 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.580334 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.601320 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.621289 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.640732 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.660633 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.680793 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.701245 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.720243 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.739891 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.760636 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 
08:41:35.768887 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/959cebb7-4057-42d3-a1bf-fc19557247cc-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ph5q8\" (UID: \"959cebb7-4057-42d3-a1bf-fc19557247cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.780474 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.800281 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.820511 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.839619 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.847388 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959cebb7-4057-42d3-a1bf-fc19557247cc-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ph5q8\" (UID: \"959cebb7-4057-42d3-a1bf-fc19557247cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.860069 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.878593 4765 request.go:700] Waited for 1.005470396s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/secrets?fieldSelector=metadata.name%3Dinstallation-pull-secrets&limit=500&resourceVersion=0 Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.880104 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.901435 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.920240 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.961048 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 03 08:41:35 crc kubenswrapper[4765]: I1003 08:41:35.981301 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.001170 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.020986 4765 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.040387 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.060402 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.081753 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.102359 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.121102 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.140007 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.160934 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.191357 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.201280 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.219916 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.240426 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.259996 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.280191 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.301017 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.320716 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.340481 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.360514 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.381198 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.421509 4765 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.438429 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.440134 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.462105 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.500408 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.501834 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5rgb\" (UniqueName: \"kubernetes.io/projected/d6e8ca49-1faf-4e22-8760-d7eca3820980-kube-api-access-f5rgb\") pod \"console-f9d7485db-g8jbc\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.521588 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.540425 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.560339 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.580920 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.601696 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.620369 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.640552 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.659527 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.679042 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.680959 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.715300 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zvc5\" (UniqueName: \"kubernetes.io/projected/a2cdc869-a3e4-410d-be35-0ad4514d8bf8-kube-api-access-6zvc5\") pod \"etcd-operator-b45778765-4rknw\" (UID: \"a2cdc869-a3e4-410d-be35-0ad4514d8bf8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.742437 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.745396 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qtnr\" (UniqueName: \"kubernetes.io/projected/4a2b0c12-72bd-44fb-88f7-18203ba2ccb6-kube-api-access-2qtnr\") pod \"machine-approver-56656f9798-vpkxq\" (UID: \"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.759190 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktdhc\" (UniqueName: \"kubernetes.io/projected/1a071347-8c80-4f91-87f3-1d95c7b18a1c-kube-api-access-ktdhc\") pod \"router-default-5444994796-f64ph\" (UID: \"1a071347-8c80-4f91-87f3-1d95c7b18a1c\") " pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.786592 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2kqd\" (UniqueName: \"kubernetes.io/projected/959cebb7-4057-42d3-a1bf-fc19557247cc-kube-api-access-m2kqd\") pod \"kube-storage-version-migrator-operator-b67b599dd-ph5q8\" (UID: \"959cebb7-4057-42d3-a1bf-fc19557247cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.810300 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7psh\" (UniqueName: \"kubernetes.io/projected/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-kube-api-access-z7psh\") pod \"controller-manager-879f6c89f-2gzl6\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.812657 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.818156 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28fmp\" (UniqueName: \"kubernetes.io/projected/909e5d8a-0d69-4973-b9ce-bc5febb55e14-kube-api-access-28fmp\") pod \"authentication-operator-69f744f599-mpxvz\" (UID: \"909e5d8a-0d69-4973-b9ce-bc5febb55e14\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:36 crc kubenswrapper[4765]: W1003 08:41:36.836283 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a2b0c12_72bd_44fb_88f7_18203ba2ccb6.slice/crio-b1704a0509242f6ff6c5c955bddc629f143bb2d4d6aff1b6e8615a6046ff7f33 WatchSource:0}: Error finding container b1704a0509242f6ff6c5c955bddc629f143bb2d4d6aff1b6e8615a6046ff7f33: Status 404 returned error can't find the container with id b1704a0509242f6ff6c5c955bddc629f143bb2d4d6aff1b6e8615a6046ff7f33 Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.837049 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vzc4\" (UniqueName: \"kubernetes.io/projected/d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc-kube-api-access-7vzc4\") pod \"downloads-7954f5f757-qdr5x\" (UID: \"d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc\") " pod="openshift-console/downloads-7954f5f757-qdr5x" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.858021 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmpcj\" (UniqueName: \"kubernetes.io/projected/6d5f9563-ba1f-4c05-a32d-127a5c01932d-kube-api-access-mmpcj\") pod \"apiserver-76f77b778f-gcgfs\" (UID: \"6d5f9563-ba1f-4c05-a32d-127a5c01932d\") " pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.877784 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwks2\" (UniqueName: \"kubernetes.io/projected/0ea22c01-e088-40b8-aecd-e83fe862bc78-kube-api-access-qwks2\") pod \"oauth-openshift-558db77b4-jgs5w\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.881324 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.898324 4765 request.go:700] Waited for 1.885041618s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.905381 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qdr5x" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.914427 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.916043 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.921201 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tr8j\" (UniqueName: \"kubernetes.io/projected/3b2c5fda-4f45-444f-991b-0afa96721739-kube-api-access-7tr8j\") pod \"route-controller-manager-6576b87f9c-4srhb\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.923706 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g8jbc"] Oct 03 08:41:36 crc kubenswrapper[4765]: W1003 08:41:36.934381 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6e8ca49_1faf_4e22_8760_d7eca3820980.slice/crio-02edebc121a2539e6a682e8abe0e16ae0068222cf3ff2034cda2deb460533dff WatchSource:0}: Error finding container 02edebc121a2539e6a682e8abe0e16ae0068222cf3ff2034cda2deb460533dff: Status 404 returned error can't find the container with id 02edebc121a2539e6a682e8abe0e16ae0068222cf3ff2034cda2deb460533dff Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.936182 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv824\" (UniqueName: \"kubernetes.io/projected/8029aef2-a0bf-4d08-b786-0bfff6f8943a-kube-api-access-pv824\") pod \"packageserver-d55dfcdfc-ts98r\" (UID: \"8029aef2-a0bf-4d08-b786-0bfff6f8943a\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.954369 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n86kd\" (UniqueName: \"kubernetes.io/projected/eccb70cc-3c95-4f97-ad20-610bb8a7b5df-kube-api-access-n86kd\") pod \"openshift-apiserver-operator-796bbdcf4f-t6dcd\" (UID: \"eccb70cc-3c95-4f97-ad20-610bb8a7b5df\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.956843 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-4rknw"] Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.961981 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g8jbc" event={"ID":"d6e8ca49-1faf-4e22-8760-d7eca3820980","Type":"ContainerStarted","Data":"02edebc121a2539e6a682e8abe0e16ae0068222cf3ff2034cda2deb460533dff"} Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.964422 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" event={"ID":"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6","Type":"ContainerStarted","Data":"b1704a0509242f6ff6c5c955bddc629f143bb2d4d6aff1b6e8615a6046ff7f33"} Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.972468 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:36 crc kubenswrapper[4765]: W1003 08:41:36.973781 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2cdc869_a3e4_410d_be35_0ad4514d8bf8.slice/crio-4e6cf10fd10a3fed4de5e0a7c9bea75f83319abedac2df5b8d96acfda452c970 WatchSource:0}: Error finding container 4e6cf10fd10a3fed4de5e0a7c9bea75f83319abedac2df5b8d96acfda452c970: Status 404 returned error can't find the container with id 4e6cf10fd10a3fed4de5e0a7c9bea75f83319abedac2df5b8d96acfda452c970 Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.975172 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz6sh\" (UniqueName: \"kubernetes.io/projected/4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170-kube-api-access-qz6sh\") pod \"openshift-config-operator-7777fb866f-9k7k4\" (UID: \"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.979722 4765 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 03 08:41:36 crc kubenswrapper[4765]: I1003 08:41:36.991244 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:36.999998 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.016330 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.048176 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnc8f\" (UniqueName: \"kubernetes.io/projected/17c891c2-c5ff-4815-9f09-347204c5da1d-kube-api-access-dnc8f\") pod \"machine-api-operator-5694c8668f-8mnt6\" (UID: \"17c891c2-c5ff-4815-9f09-347204c5da1d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.058011 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.097894 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.126755 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.130316 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cc05495-73d0-4866-adcb-aa89431470c5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vphkx\" (UID: \"0cc05495-73d0-4866-adcb-aa89431470c5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.130346 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc5dt\" (UniqueName: \"kubernetes.io/projected/0cc05495-73d0-4866-adcb-aa89431470c5-kube-api-access-rc5dt\") pod \"cluster-samples-operator-665b6dd947-vphkx\" (UID: \"0cc05495-73d0-4866-adcb-aa89431470c5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.130369 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8bb60ae0-1dd0-4af1-ba69-49f17ed39eba-signing-cabundle\") pod \"service-ca-9c57cc56f-pfj5p\" (UID: \"8bb60ae0-1dd0-4af1-ba69-49f17ed39eba\") " pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.130387 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6639174a-08b1-409e-a6f1-5e238ef9ae85-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7fwgr\" (UID: \"6639174a-08b1-409e-a6f1-5e238ef9ae85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.130423 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-bound-sa-token\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.130438 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8a9647c-0b02-4a08-91ee-537052124a65-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w9ww9\" (UID: \"f8a9647c-0b02-4a08-91ee-537052124a65\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.130455 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6639174a-08b1-409e-a6f1-5e238ef9ae85-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7fwgr\" (UID: \"6639174a-08b1-409e-a6f1-5e238ef9ae85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132666 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132698 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8bb60ae0-1dd0-4af1-ba69-49f17ed39eba-signing-key\") pod \"service-ca-9c57cc56f-pfj5p\" (UID: \"8bb60ae0-1dd0-4af1-ba69-49f17ed39eba\") " pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132746 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d421bb9-ba2e-416a-9554-7c4c7c93658b-proxy-tls\") pod \"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132801 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-779bx\" (UniqueName: \"kubernetes.io/projected/6d421bb9-ba2e-416a-9554-7c4c7c93658b-kube-api-access-779bx\") pod \"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132837 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-serving-cert\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132864 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-tls\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132882 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32b16068-abfd-4a3f-870c-a17c7ff31d4b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132899 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee85c45f-e702-4221-a738-c57382513f5b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-djlns\" (UID: \"ee85c45f-e702-4221-a738-c57382513f5b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132918 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6ef10623-b762-418f-bc9d-36a66d6ec9fd-serving-cert\") pod \"console-operator-58897d9998-gzcf9\" (UID: \"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132943 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-audit-dir\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132962 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/69cc377a-02f9-4e57-a9a8-776b5cef5b9b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tmd5f\" (UID: \"69cc377a-02f9-4e57-a9a8-776b5cef5b9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.132990 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc9gw\" (UniqueName: \"kubernetes.io/projected/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-kube-api-access-lc9gw\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" (UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.133031 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8dbx\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-kube-api-access-w8dbx\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.133050 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf30b77-2306-4ae3-9ae2-02af916249f2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wxg4\" (UID: \"3bf30b77-2306-4ae3-9ae2-02af916249f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.133065 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-trusted-ca\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.133081 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3bf30b77-2306-4ae3-9ae2-02af916249f2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wxg4\" (UID: \"3bf30b77-2306-4ae3-9ae2-02af916249f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.133108 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qnvf4\" (UniqueName: \"kubernetes.io/projected/a7ea52a8-bcd9-4234-ba4d-f4181094c260-kube-api-access-qnvf4\") pod \"dns-operator-744455d44c-swd9l\" (UID: \"a7ea52a8-bcd9-4234-ba4d-f4181094c260\") " pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.133133 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-certificates\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.133959 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6d421bb9-ba2e-416a-9554-7c4c7c93658b-images\") pod \"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.135013 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c15a1eca-d125-468b-ac64-8046e4bcd19b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rhthj\" (UID: \"c15a1eca-d125-468b-ac64-8046e4bcd19b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.135145 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.135171 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mh7d\" (UniqueName: \"kubernetes.io/projected/6ef10623-b762-418f-bc9d-36a66d6ec9fd-kube-api-access-6mh7d\") pod \"console-operator-58897d9998-gzcf9\" (UID: \"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.135221 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.135251 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32b16068-abfd-4a3f-870c-a17c7ff31d4b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.135774 4765 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:37.635757108 +0000 UTC m=+141.937251508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.136070 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-etcd-client\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.136312 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c15a1eca-d125-468b-ac64-8046e4bcd19b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rhthj\" (UID: \"c15a1eca-d125-468b-ac64-8046e4bcd19b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.136362 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7ea52a8-bcd9-4234-ba4d-f4181094c260-metrics-tls\") pod \"dns-operator-744455d44c-swd9l\" (UID: \"a7ea52a8-bcd9-4234-ba4d-f4181094c260\") " pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.136393 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d421bb9-ba2e-416a-9554-7c4c7c93658b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.136464 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" (UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.136794 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" (UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.136821 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-sb9vl\" (UniqueName: \"kubernetes.io/projected/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-kube-api-access-sb9vl\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.136844 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-999jl\" (UniqueName: \"kubernetes.io/projected/6639174a-08b1-409e-a6f1-5e238ef9ae85-kube-api-access-999jl\") pod \"openshift-controller-manager-operator-756b6f6bc6-7fwgr\" (UID: \"6639174a-08b1-409e-a6f1-5e238ef9ae85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.137266 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpdjb\" (UniqueName: \"kubernetes.io/projected/69cc377a-02f9-4e57-a9a8-776b5cef5b9b-kube-api-access-xpdjb\") pod \"olm-operator-6b444d44fb-tmd5f\" (UID: \"69cc377a-02f9-4e57-a9a8-776b5cef5b9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.137539 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c15a1eca-d125-468b-ac64-8046e4bcd19b-config\") pod \"kube-controller-manager-operator-78b949d7b-rhthj\" (UID: \"c15a1eca-d125-468b-ac64-8046e4bcd19b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.137558 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef10623-b762-418f-bc9d-36a66d6ec9fd-config\") pod \"console-operator-58897d9998-gzcf9\" (UID: \"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.137597 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-encryption-config\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.138158 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a9647c-0b02-4a08-91ee-537052124a65-config\") pod \"kube-apiserver-operator-766d6c64bb-w9ww9\" (UID: \"f8a9647c-0b02-4a08-91ee-537052124a65\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.138184 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ef10623-b762-418f-bc9d-36a66d6ec9fd-trusted-ca\") pod \"console-operator-58897d9998-gzcf9\" (UID: \"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.138227 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8a9647c-0b02-4a08-91ee-537052124a65-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w9ww9\" (UID: \"f8a9647c-0b02-4a08-91ee-537052124a65\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.138251 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftccq\" (UniqueName: \"kubernetes.io/projected/ee85c45f-e702-4221-a738-c57382513f5b-kube-api-access-ftccq\") pod \"machine-config-controller-84d6567774-djlns\" (UID: \"ee85c45f-e702-4221-a738-c57382513f5b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.138274 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk48h\" (UniqueName: \"kubernetes.io/projected/8bb60ae0-1dd0-4af1-ba69-49f17ed39eba-kube-api-access-mk48h\") pod \"service-ca-9c57cc56f-pfj5p\" (UID: \"8bb60ae0-1dd0-4af1-ba69-49f17ed39eba\") " pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.138293 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69cc377a-02f9-4e57-a9a8-776b5cef5b9b-srv-cert\") pod \"olm-operator-6b444d44fb-tmd5f\" (UID: \"69cc377a-02f9-4e57-a9a8-776b5cef5b9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.138810 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf30b77-2306-4ae3-9ae2-02af916249f2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wxg4\" (UID: \"3bf30b77-2306-4ae3-9ae2-02af916249f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.138848 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-audit-policies\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.138889 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" (UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.138917 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee85c45f-e702-4221-a738-c57382513f5b-proxy-tls\") pod \"machine-config-controller-84d6567774-djlns\" (UID: \"ee85c45f-e702-4221-a738-c57382513f5b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.178482 4765 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.187576 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.190098 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.221628 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qdr5x"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.227294 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.244701 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.244949 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:37.744909239 +0000 UTC m=+142.046403569 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.245119 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" (UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.245160 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee85c45f-e702-4221-a738-c57382513f5b-proxy-tls\") pod \"machine-config-controller-84d6567774-djlns\" (UID: \"ee85c45f-e702-4221-a738-c57382513f5b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.245358 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf30b77-2306-4ae3-9ae2-02af916249f2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wxg4\" (UID: \"3bf30b77-2306-4ae3-9ae2-02af916249f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" Oct 03 08:41:37 crc 
kubenswrapper[4765]: I1003 08:41:37.245389 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-audit-policies\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.246329 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94f97c0b-6272-475f-8794-4d9d26318d18-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.246264 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-audit-policies\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.246418 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-registration-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.246689 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cc05495-73d0-4866-adcb-aa89431470c5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vphkx\" (UID: \"0cc05495-73d0-4866-adcb-aa89431470c5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247205 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc5dt\" (UniqueName: \"kubernetes.io/projected/0cc05495-73d0-4866-adcb-aa89431470c5-kube-api-access-rc5dt\") pod \"cluster-samples-operator-665b6dd947-vphkx\" (UID: \"0cc05495-73d0-4866-adcb-aa89431470c5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247245 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2j29\" (UniqueName: \"kubernetes.io/projected/432cff95-d219-46af-bfc4-c5afbe99c9c0-kube-api-access-s2j29\") pod \"collect-profiles-29324670-fpjm8\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247276 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8bb60ae0-1dd0-4af1-ba69-49f17ed39eba-signing-cabundle\") pod \"service-ca-9c57cc56f-pfj5p\" (UID: \"8bb60ae0-1dd0-4af1-ba69-49f17ed39eba\") " pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247316 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6639174a-08b1-409e-a6f1-5e238ef9ae85-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7fwgr\" (UID: \"6639174a-08b1-409e-a6f1-5e238ef9ae85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247358 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-bound-sa-token\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247386 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8a9647c-0b02-4a08-91ee-537052124a65-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w9ww9\" (UID: \"f8a9647c-0b02-4a08-91ee-537052124a65\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247414 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6639174a-08b1-409e-a6f1-5e238ef9ae85-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7fwgr\" (UID: \"6639174a-08b1-409e-a6f1-5e238ef9ae85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247442 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247467 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8bb60ae0-1dd0-4af1-ba69-49f17ed39eba-signing-key\") pod \"service-ca-9c57cc56f-pfj5p\" (UID: \"8bb60ae0-1dd0-4af1-ba69-49f17ed39eba\") " pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247532 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mmkp\" (UniqueName: \"kubernetes.io/projected/4a3a7817-f128-4b5a-bbb7-604c846009d5-kube-api-access-5mmkp\") pod \"multus-admission-controller-857f4d67dd-rmm5l\" (UID: \"4a3a7817-f128-4b5a-bbb7-604c846009d5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247559 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58649d2-054a-42ed-848d-beb9e9de3522-serving-cert\") pod \"service-ca-operator-777779d784-n4hxl\" (UID: \"a58649d2-054a-42ed-848d-beb9e9de3522\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247632 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d421bb9-ba2e-416a-9554-7c4c7c93658b-proxy-tls\") pod 
\"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247696 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlf6x\" (UniqueName: \"kubernetes.io/projected/0599e7ee-91e4-4ef3-8b8d-a5aca9e637d3-kube-api-access-rlf6x\") pod \"migrator-59844c95c7-kthpd\" (UID: \"0599e7ee-91e4-4ef3-8b8d-a5aca9e637d3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247726 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2mgk\" (UniqueName: \"kubernetes.io/projected/231de3ee-0e46-4fa8-8380-b31d98d3fab0-kube-api-access-t2mgk\") pod \"machine-config-server-7s2rp\" (UID: \"231de3ee-0e46-4fa8-8380-b31d98d3fab0\") " pod="openshift-machine-config-operator/machine-config-server-7s2rp" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247756 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-779bx\" (UniqueName: \"kubernetes.io/projected/6d421bb9-ba2e-416a-9554-7c4c7c93658b-kube-api-access-779bx\") pod \"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247787 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c59afdc0-a7ed-4cc2-8972-7c8d7414375e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gvz2v\" (UID: \"c59afdc0-a7ed-4cc2-8972-7c8d7414375e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247841 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-serving-cert\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247880 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-csi-data-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247914 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-tls\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247938 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxt6j\" (UniqueName: \"kubernetes.io/projected/43701ed5-3c65-480e-b414-9757b707d6be-kube-api-access-pxt6j\") pod \"dns-default-5g7vw\" (UID: 
\"43701ed5-3c65-480e-b414-9757b707d6be\") " pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247960 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/19fb459f-dca0-464c-9cc0-830b67a34583-srv-cert\") pod \"catalog-operator-68c6474976-pmrxv\" (UID: \"19fb459f-dca0-464c-9cc0-830b67a34583\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.247987 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32b16068-abfd-4a3f-870c-a17c7ff31d4b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248027 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee85c45f-e702-4221-a738-c57382513f5b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-djlns\" (UID: \"ee85c45f-e702-4221-a738-c57382513f5b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248051 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef10623-b762-418f-bc9d-36a66d6ec9fd-serving-cert\") pod \"console-operator-58897d9998-gzcf9\" (UID: \"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248072 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-audit-dir\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248124 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/69cc377a-02f9-4e57-a9a8-776b5cef5b9b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tmd5f\" (UID: \"69cc377a-02f9-4e57-a9a8-776b5cef5b9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248149 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94f97c0b-6272-475f-8794-4d9d26318d18-trusted-ca\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248193 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc9gw\" (UniqueName: \"kubernetes.io/projected/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-kube-api-access-lc9gw\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" (UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 
08:41:37.248220 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8dbx\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-kube-api-access-w8dbx\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248241 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf30b77-2306-4ae3-9ae2-02af916249f2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wxg4\" (UID: \"3bf30b77-2306-4ae3-9ae2-02af916249f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248265 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-trusted-ca\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248304 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3bf30b77-2306-4ae3-9ae2-02af916249f2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wxg4\" (UID: \"3bf30b77-2306-4ae3-9ae2-02af916249f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248331 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnvf4\" (UniqueName: \"kubernetes.io/projected/a7ea52a8-bcd9-4234-ba4d-f4181094c260-kube-api-access-qnvf4\") pod \"dns-operator-744455d44c-swd9l\" (UID: \"a7ea52a8-bcd9-4234-ba4d-f4181094c260\") " pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248380 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-certificates\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248412 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/432cff95-d219-46af-bfc4-c5afbe99c9c0-secret-volume\") pod \"collect-profiles-29324670-fpjm8\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248436 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c2h2\" (UniqueName: \"kubernetes.io/projected/19fb459f-dca0-464c-9cc0-830b67a34583-kube-api-access-5c2h2\") pod \"catalog-operator-68c6474976-pmrxv\" (UID: \"19fb459f-dca0-464c-9cc0-830b67a34583\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248458 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-mountpoint-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248501 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6d421bb9-ba2e-416a-9554-7c4c7c93658b-images\") pod \"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248556 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/43701ed5-3c65-480e-b414-9757b707d6be-metrics-tls\") pod \"dns-default-5g7vw\" (UID: \"43701ed5-3c65-480e-b414-9757b707d6be\") " pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248584 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c15a1eca-d125-468b-ac64-8046e4bcd19b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rhthj\" (UID: \"c15a1eca-d125-468b-ac64-8046e4bcd19b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.248608 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9g9cw\" (UID: \"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.249244 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8bb60ae0-1dd0-4af1-ba69-49f17ed39eba-signing-cabundle\") pod \"service-ca-9c57cc56f-pfj5p\" (UID: \"8bb60ae0-1dd0-4af1-ba69-49f17ed39eba\") " pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.250052 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee85c45f-e702-4221-a738-c57382513f5b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-djlns\" (UID: \"ee85c45f-e702-4221-a738-c57382513f5b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251339 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85mlv\" (UniqueName: \"kubernetes.io/projected/28cc4e4f-507b-49c7-9a8f-2107e600e834-kube-api-access-85mlv\") pod \"marketplace-operator-79b997595-9g9cw\" (UID: \"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251405 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/19fb459f-dca0-464c-9cc0-830b67a34583-profile-collector-cert\") pod \"catalog-operator-68c6474976-pmrxv\" (UID: \"19fb459f-dca0-464c-9cc0-830b67a34583\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251431 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/231de3ee-0e46-4fa8-8380-b31d98d3fab0-certs\") pod \"machine-config-server-7s2rp\" (UID: \"231de3ee-0e46-4fa8-8380-b31d98d3fab0\") " pod="openshift-machine-config-operator/machine-config-server-7s2rp" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251467 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94f97c0b-6272-475f-8794-4d9d26318d18-metrics-tls\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251494 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h28hx\" (UniqueName: \"kubernetes.io/projected/8f8201b3-edba-4bac-9d31-08452195ff1f-kube-api-access-h28hx\") pod \"control-plane-machine-set-operator-78cbb6b69f-rnsx7\" (UID: \"8f8201b3-edba-4bac-9d31-08452195ff1f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251527 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251552 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32b16068-abfd-4a3f-870c-a17c7ff31d4b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251577 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251601 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mh7d\" (UniqueName: \"kubernetes.io/projected/6ef10623-b762-418f-bc9d-36a66d6ec9fd-kube-api-access-6mh7d\") pod \"console-operator-58897d9998-gzcf9\" (UID: \"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251633 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j27lj\" (UniqueName: 
\"kubernetes.io/projected/a58649d2-054a-42ed-848d-beb9e9de3522-kube-api-access-j27lj\") pod \"service-ca-operator-777779d784-n4hxl\" (UID: \"a58649d2-054a-42ed-848d-beb9e9de3522\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251684 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43701ed5-3c65-480e-b414-9757b707d6be-config-volume\") pod \"dns-default-5g7vw\" (UID: \"43701ed5-3c65-480e-b414-9757b707d6be\") " pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251708 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhz42\" (UniqueName: \"kubernetes.io/projected/94f97c0b-6272-475f-8794-4d9d26318d18-kube-api-access-hhz42\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251737 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-etcd-client\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251783 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c15a1eca-d125-468b-ac64-8046e4bcd19b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rhthj\" (UID: \"c15a1eca-d125-468b-ac64-8046e4bcd19b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251811 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9g9cw\" (UID: \"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251835 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7ea52a8-bcd9-4234-ba4d-f4181094c260-metrics-tls\") pod \"dns-operator-744455d44c-swd9l\" (UID: \"a7ea52a8-bcd9-4234-ba4d-f4181094c260\") " pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.251963 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d421bb9-ba2e-416a-9554-7c4c7c93658b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.252386 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" 
(UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.252437 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a3a7817-f128-4b5a-bbb7-604c846009d5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rmm5l\" (UID: \"4a3a7817-f128-4b5a-bbb7-604c846009d5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.252460 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" (UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.252484 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb9vl\" (UniqueName: \"kubernetes.io/projected/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-kube-api-access-sb9vl\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.252522 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-999jl\" (UniqueName: \"kubernetes.io/projected/6639174a-08b1-409e-a6f1-5e238ef9ae85-kube-api-access-999jl\") pod \"openshift-controller-manager-operator-756b6f6bc6-7fwgr\" (UID: \"6639174a-08b1-409e-a6f1-5e238ef9ae85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.252545 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/432cff95-d219-46af-bfc4-c5afbe99c9c0-config-volume\") pod \"collect-profiles-29324670-fpjm8\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.252565 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f247d\" (UniqueName: \"kubernetes.io/projected/c59afdc0-a7ed-4cc2-8972-7c8d7414375e-kube-api-access-f247d\") pod \"package-server-manager-789f6589d5-gvz2v\" (UID: \"c59afdc0-a7ed-4cc2-8972-7c8d7414375e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.252600 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpdjb\" (UniqueName: \"kubernetes.io/projected/69cc377a-02f9-4e57-a9a8-776b5cef5b9b-kube-api-access-xpdjb\") pod \"olm-operator-6b444d44fb-tmd5f\" (UID: \"69cc377a-02f9-4e57-a9a8-776b5cef5b9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.252617 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-socket-dir\") pod 
\"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.256215 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6639174a-08b1-409e-a6f1-5e238ef9ae85-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7fwgr\" (UID: \"6639174a-08b1-409e-a6f1-5e238ef9ae85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.256926 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c15a1eca-d125-468b-ac64-8046e4bcd19b-config\") pod \"kube-controller-manager-operator-78b949d7b-rhthj\" (UID: \"c15a1eca-d125-468b-ac64-8046e4bcd19b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.256971 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef10623-b762-418f-bc9d-36a66d6ec9fd-config\") pod \"console-operator-58897d9998-gzcf9\" (UID: \"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257023 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/231de3ee-0e46-4fa8-8380-b31d98d3fab0-node-bootstrap-token\") pod \"machine-config-server-7s2rp\" (UID: \"231de3ee-0e46-4fa8-8380-b31d98d3fab0\") " pod="openshift-machine-config-operator/machine-config-server-7s2rp" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257064 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-encryption-config\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257118 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-plugins-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257137 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58649d2-054a-42ed-848d-beb9e9de3522-config\") pod \"service-ca-operator-777779d784-n4hxl\" (UID: \"a58649d2-054a-42ed-848d-beb9e9de3522\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257200 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8f8201b3-edba-4bac-9d31-08452195ff1f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rnsx7\" (UID: \"8f8201b3-edba-4bac-9d31-08452195ff1f\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257263 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a9647c-0b02-4a08-91ee-537052124a65-config\") pod \"kube-apiserver-operator-766d6c64bb-w9ww9\" (UID: \"f8a9647c-0b02-4a08-91ee-537052124a65\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257284 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ef10623-b762-418f-bc9d-36a66d6ec9fd-trusted-ca\") pod \"console-operator-58897d9998-gzcf9\" (UID: \"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257306 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npv8q\" (UniqueName: \"kubernetes.io/projected/193eea7c-6015-42df-b104-9a2848192515-kube-api-access-npv8q\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257348 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftccq\" (UniqueName: \"kubernetes.io/projected/ee85c45f-e702-4221-a738-c57382513f5b-kube-api-access-ftccq\") pod \"machine-config-controller-84d6567774-djlns\" (UID: \"ee85c45f-e702-4221-a738-c57382513f5b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257366 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8a9647c-0b02-4a08-91ee-537052124a65-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w9ww9\" (UID: \"f8a9647c-0b02-4a08-91ee-537052124a65\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257384 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk48h\" (UniqueName: \"kubernetes.io/projected/8bb60ae0-1dd0-4af1-ba69-49f17ed39eba-kube-api-access-mk48h\") pod \"service-ca-9c57cc56f-pfj5p\" (UID: \"8bb60ae0-1dd0-4af1-ba69-49f17ed39eba\") " pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257501 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69cc377a-02f9-4e57-a9a8-776b5cef5b9b-srv-cert\") pod \"olm-operator-6b444d44fb-tmd5f\" (UID: \"69cc377a-02f9-4e57-a9a8-776b5cef5b9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257519 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba0a8c66-3540-44e3-a29f-dda86ace66e8-cert\") pod \"ingress-canary-qj7dr\" (UID: \"ba0a8c66-3540-44e3-a29f-dda86ace66e8\") " pod="openshift-ingress-canary/ingress-canary-qj7dr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257581 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbpf5\" (UniqueName: \"kubernetes.io/projected/ba0a8c66-3540-44e3-a29f-dda86ace66e8-kube-api-access-qbpf5\") pod \"ingress-canary-qj7dr\" (UID: \"ba0a8c66-3540-44e3-a29f-dda86ace66e8\") " pod="openshift-ingress-canary/ingress-canary-qj7dr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.257995 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-audit-dir\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.259197 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c15a1eca-d125-468b-ac64-8046e4bcd19b-config\") pod \"kube-controller-manager-operator-78b949d7b-rhthj\" (UID: \"c15a1eca-d125-468b-ac64-8046e4bcd19b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.259986 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf30b77-2306-4ae3-9ae2-02af916249f2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wxg4\" (UID: \"3bf30b77-2306-4ae3-9ae2-02af916249f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.260413 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:37.760398841 +0000 UTC m=+142.061893171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.260554 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c15a1eca-d125-468b-ac64-8046e4bcd19b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rhthj\" (UID: \"c15a1eca-d125-468b-ac64-8046e4bcd19b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.262836 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf30b77-2306-4ae3-9ae2-02af916249f2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wxg4\" (UID: \"3bf30b77-2306-4ae3-9ae2-02af916249f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.263813 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a9647c-0b02-4a08-91ee-537052124a65-config\") pod \"kube-apiserver-operator-766d6c64bb-w9ww9\" (UID: \"f8a9647c-0b02-4a08-91ee-537052124a65\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.264249 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6639174a-08b1-409e-a6f1-5e238ef9ae85-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7fwgr\" (UID: \"6639174a-08b1-409e-a6f1-5e238ef9ae85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.265108 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cc05495-73d0-4866-adcb-aa89431470c5-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vphkx\" (UID: \"0cc05495-73d0-4866-adcb-aa89431470c5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.265146 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8a9647c-0b02-4a08-91ee-537052124a65-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-w9ww9\" (UID: \"f8a9647c-0b02-4a08-91ee-537052124a65\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.265613 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a7ea52a8-bcd9-4234-ba4d-f4181094c260-metrics-tls\") pod \"dns-operator-744455d44c-swd9l\" (UID: \"a7ea52a8-bcd9-4234-ba4d-f4181094c260\") " pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.269556 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d421bb9-ba2e-416a-9554-7c4c7c93658b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.270235 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee85c45f-e702-4221-a738-c57382513f5b-proxy-tls\") pod \"machine-config-controller-84d6567774-djlns\" (UID: \"ee85c45f-e702-4221-a738-c57382513f5b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.270505 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32b16068-abfd-4a3f-870c-a17c7ff31d4b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.270641 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-trusted-ca\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.271527 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6d421bb9-ba2e-416a-9554-7c4c7c93658b-images\") pod \"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.271824 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-tls\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.291839 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/69cc377a-02f9-4e57-a9a8-776b5cef5b9b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-tmd5f\" (UID: \"69cc377a-02f9-4e57-a9a8-776b5cef5b9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.292088 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-certificates\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.292364 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ef10623-b762-418f-bc9d-36a66d6ec9fd-config\") pod \"console-operator-58897d9998-gzcf9\" (UID: 
\"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.292943 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ef10623-b762-418f-bc9d-36a66d6ec9fd-trusted-ca\") pod \"console-operator-58897d9998-gzcf9\" (UID: \"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.293000 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-etcd-client\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.294726 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.295321 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-encryption-config\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.296091 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-serving-cert\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.296160 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32b16068-abfd-4a3f-870c-a17c7ff31d4b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.296296 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.297015 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d421bb9-ba2e-416a-9554-7c4c7c93658b-proxy-tls\") pod \"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.297508 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8bb60ae0-1dd0-4af1-ba69-49f17ed39eba-signing-key\") pod 
\"service-ca-9c57cc56f-pfj5p\" (UID: \"8bb60ae0-1dd0-4af1-ba69-49f17ed39eba\") " pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.298119 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69cc377a-02f9-4e57-a9a8-776b5cef5b9b-srv-cert\") pod \"olm-operator-6b444d44fb-tmd5f\" (UID: \"69cc377a-02f9-4e57-a9a8-776b5cef5b9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.298351 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ef10623-b762-418f-bc9d-36a66d6ec9fd-serving-cert\") pod \"console-operator-58897d9998-gzcf9\" (UID: \"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.299285 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" (UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.309829 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" (UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: W1003 08:41:37.312720 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod959cebb7_4057_42d3_a1bf_fc19557247cc.slice/crio-d46422da155def79728bed57a9031732e5ab30d2d685e0179a581329abc54998 WatchSource:0}: Error finding container d46422da155def79728bed57a9031732e5ab30d2d685e0179a581329abc54998: Status 404 returned error can't find the container with id d46422da155def79728bed57a9031732e5ab30d2d685e0179a581329abc54998 Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.314336 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gcgfs"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.316111 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc5dt\" (UniqueName: \"kubernetes.io/projected/0cc05495-73d0-4866-adcb-aa89431470c5-kube-api-access-rc5dt\") pod \"cluster-samples-operator-665b6dd947-vphkx\" (UID: \"0cc05495-73d0-4866-adcb-aa89431470c5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.316215 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" (UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.337266 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.339510 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-bound-sa-token\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.359520 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mpxvz"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.365842 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366077 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j27lj\" (UniqueName: \"kubernetes.io/projected/a58649d2-054a-42ed-848d-beb9e9de3522-kube-api-access-j27lj\") pod \"service-ca-operator-777779d784-n4hxl\" (UID: \"a58649d2-054a-42ed-848d-beb9e9de3522\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366111 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43701ed5-3c65-480e-b414-9757b707d6be-config-volume\") pod \"dns-default-5g7vw\" (UID: \"43701ed5-3c65-480e-b414-9757b707d6be\") " pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366131 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhz42\" (UniqueName: \"kubernetes.io/projected/94f97c0b-6272-475f-8794-4d9d26318d18-kube-api-access-hhz42\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.366227 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:37.866186388 +0000 UTC m=+142.167680718 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366314 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9g9cw\" (UID: \"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366382 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a3a7817-f128-4b5a-bbb7-604c846009d5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rmm5l\" (UID: \"4a3a7817-f128-4b5a-bbb7-604c846009d5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366441 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/432cff95-d219-46af-bfc4-c5afbe99c9c0-config-volume\") pod \"collect-profiles-29324670-fpjm8\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366464 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f247d\" (UniqueName: \"kubernetes.io/projected/c59afdc0-a7ed-4cc2-8972-7c8d7414375e-kube-api-access-f247d\") pod \"package-server-manager-789f6589d5-gvz2v\" (UID: \"c59afdc0-a7ed-4cc2-8972-7c8d7414375e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366497 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-socket-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366536 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/231de3ee-0e46-4fa8-8380-b31d98d3fab0-node-bootstrap-token\") pod \"machine-config-server-7s2rp\" (UID: \"231de3ee-0e46-4fa8-8380-b31d98d3fab0\") " pod="openshift-machine-config-operator/machine-config-server-7s2rp" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366577 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-plugins-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366603 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a58649d2-054a-42ed-848d-beb9e9de3522-config\") pod \"service-ca-operator-777779d784-n4hxl\" (UID: \"a58649d2-054a-42ed-848d-beb9e9de3522\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366631 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8f8201b3-edba-4bac-9d31-08452195ff1f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rnsx7\" (UID: \"8f8201b3-edba-4bac-9d31-08452195ff1f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366683 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npv8q\" (UniqueName: \"kubernetes.io/projected/193eea7c-6015-42df-b104-9a2848192515-kube-api-access-npv8q\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366737 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba0a8c66-3540-44e3-a29f-dda86ace66e8-cert\") pod \"ingress-canary-qj7dr\" (UID: \"ba0a8c66-3540-44e3-a29f-dda86ace66e8\") " pod="openshift-ingress-canary/ingress-canary-qj7dr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366771 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbpf5\" (UniqueName: \"kubernetes.io/projected/ba0a8c66-3540-44e3-a29f-dda86ace66e8-kube-api-access-qbpf5\") pod \"ingress-canary-qj7dr\" (UID: \"ba0a8c66-3540-44e3-a29f-dda86ace66e8\") " pod="openshift-ingress-canary/ingress-canary-qj7dr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366812 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94f97c0b-6272-475f-8794-4d9d26318d18-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366834 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-registration-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366882 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2j29\" (UniqueName: \"kubernetes.io/projected/432cff95-d219-46af-bfc4-c5afbe99c9c0-kube-api-access-s2j29\") pod \"collect-profiles-29324670-fpjm8\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366932 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mmkp\" (UniqueName: \"kubernetes.io/projected/4a3a7817-f128-4b5a-bbb7-604c846009d5-kube-api-access-5mmkp\") pod \"multus-admission-controller-857f4d67dd-rmm5l\" (UID: \"4a3a7817-f128-4b5a-bbb7-604c846009d5\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366952 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58649d2-054a-42ed-848d-beb9e9de3522-serving-cert\") pod \"service-ca-operator-777779d784-n4hxl\" (UID: \"a58649d2-054a-42ed-848d-beb9e9de3522\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.366981 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlf6x\" (UniqueName: \"kubernetes.io/projected/0599e7ee-91e4-4ef3-8b8d-a5aca9e637d3-kube-api-access-rlf6x\") pod \"migrator-59844c95c7-kthpd\" (UID: \"0599e7ee-91e4-4ef3-8b8d-a5aca9e637d3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367003 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2mgk\" (UniqueName: \"kubernetes.io/projected/231de3ee-0e46-4fa8-8380-b31d98d3fab0-kube-api-access-t2mgk\") pod \"machine-config-server-7s2rp\" (UID: \"231de3ee-0e46-4fa8-8380-b31d98d3fab0\") " pod="openshift-machine-config-operator/machine-config-server-7s2rp" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367045 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c59afdc0-a7ed-4cc2-8972-7c8d7414375e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gvz2v\" (UID: \"c59afdc0-a7ed-4cc2-8972-7c8d7414375e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367085 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-csi-data-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367113 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxt6j\" (UniqueName: \"kubernetes.io/projected/43701ed5-3c65-480e-b414-9757b707d6be-kube-api-access-pxt6j\") pod \"dns-default-5g7vw\" (UID: \"43701ed5-3c65-480e-b414-9757b707d6be\") " pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367132 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/19fb459f-dca0-464c-9cc0-830b67a34583-srv-cert\") pod \"catalog-operator-68c6474976-pmrxv\" (UID: \"19fb459f-dca0-464c-9cc0-830b67a34583\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367184 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94f97c0b-6272-475f-8794-4d9d26318d18-trusted-ca\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367280 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/432cff95-d219-46af-bfc4-c5afbe99c9c0-secret-volume\") pod \"collect-profiles-29324670-fpjm8\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367304 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c2h2\" (UniqueName: \"kubernetes.io/projected/19fb459f-dca0-464c-9cc0-830b67a34583-kube-api-access-5c2h2\") pod \"catalog-operator-68c6474976-pmrxv\" (UID: \"19fb459f-dca0-464c-9cc0-830b67a34583\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367326 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-mountpoint-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367379 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/43701ed5-3c65-480e-b414-9757b707d6be-metrics-tls\") pod \"dns-default-5g7vw\" (UID: \"43701ed5-3c65-480e-b414-9757b707d6be\") " pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367404 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9g9cw\" (UID: \"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367429 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85mlv\" (UniqueName: \"kubernetes.io/projected/28cc4e4f-507b-49c7-9a8f-2107e600e834-kube-api-access-85mlv\") pod \"marketplace-operator-79b997595-9g9cw\" (UID: \"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367457 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/19fb459f-dca0-464c-9cc0-830b67a34583-profile-collector-cert\") pod \"catalog-operator-68c6474976-pmrxv\" (UID: \"19fb459f-dca0-464c-9cc0-830b67a34583\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367478 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94f97c0b-6272-475f-8794-4d9d26318d18-metrics-tls\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367498 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/231de3ee-0e46-4fa8-8380-b31d98d3fab0-certs\") pod \"machine-config-server-7s2rp\" (UID: \"231de3ee-0e46-4fa8-8380-b31d98d3fab0\") " 
pod="openshift-machine-config-operator/machine-config-server-7s2rp" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367523 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h28hx\" (UniqueName: \"kubernetes.io/projected/8f8201b3-edba-4bac-9d31-08452195ff1f-kube-api-access-h28hx\") pod \"control-plane-machine-set-operator-78cbb6b69f-rnsx7\" (UID: \"8f8201b3-edba-4bac-9d31-08452195ff1f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.367549 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.367922 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:37.867911391 +0000 UTC m=+142.169405721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.368597 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43701ed5-3c65-480e-b414-9757b707d6be-config-volume\") pod \"dns-default-5g7vw\" (UID: \"43701ed5-3c65-480e-b414-9757b707d6be\") " pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.370057 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-plugins-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.370609 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a58649d2-054a-42ed-848d-beb9e9de3522-config\") pod \"service-ca-operator-777779d784-n4hxl\" (UID: \"a58649d2-054a-42ed-848d-beb9e9de3522\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.371824 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2gzl6"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.374592 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-mountpoint-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: 
I1003 08:41:37.374751 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9g9cw\" (UID: \"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.374852 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-registration-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.378056 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/432cff95-d219-46af-bfc4-c5afbe99c9c0-config-volume\") pod \"collect-profiles-29324670-fpjm8\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.380321 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/231de3ee-0e46-4fa8-8380-b31d98d3fab0-node-bootstrap-token\") pod \"machine-config-server-7s2rp\" (UID: \"231de3ee-0e46-4fa8-8380-b31d98d3fab0\") " pod="openshift-machine-config-operator/machine-config-server-7s2rp" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.378365 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-socket-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.378706 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-779bx\" (UniqueName: \"kubernetes.io/projected/6d421bb9-ba2e-416a-9554-7c4c7c93658b-kube-api-access-779bx\") pod \"machine-config-operator-74547568cd-74jfz\" (UID: \"6d421bb9-ba2e-416a-9554-7c4c7c93658b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.378798 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/193eea7c-6015-42df-b104-9a2848192515-csi-data-dir\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.378308 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/4a3a7817-f128-4b5a-bbb7-604c846009d5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rmm5l\" (UID: \"4a3a7817-f128-4b5a-bbb7-604c846009d5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.383841 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9g9cw\" (UID: 
\"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.384022 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a58649d2-054a-42ed-848d-beb9e9de3522-serving-cert\") pod \"service-ca-operator-777779d784-n4hxl\" (UID: \"a58649d2-054a-42ed-848d-beb9e9de3522\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.384096 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/43701ed5-3c65-480e-b414-9757b707d6be-metrics-tls\") pod \"dns-default-5g7vw\" (UID: \"43701ed5-3c65-480e-b414-9757b707d6be\") " pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.384576 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba0a8c66-3540-44e3-a29f-dda86ace66e8-cert\") pod \"ingress-canary-qj7dr\" (UID: \"ba0a8c66-3540-44e3-a29f-dda86ace66e8\") " pod="openshift-ingress-canary/ingress-canary-qj7dr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.384710 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/19fb459f-dca0-464c-9cc0-830b67a34583-srv-cert\") pod \"catalog-operator-68c6474976-pmrxv\" (UID: \"19fb459f-dca0-464c-9cc0-830b67a34583\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.385306 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94f97c0b-6272-475f-8794-4d9d26318d18-trusted-ca\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.393779 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/231de3ee-0e46-4fa8-8380-b31d98d3fab0-certs\") pod \"machine-config-server-7s2rp\" (UID: \"231de3ee-0e46-4fa8-8380-b31d98d3fab0\") " pod="openshift-machine-config-operator/machine-config-server-7s2rp" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.394284 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/19fb459f-dca0-464c-9cc0-830b67a34583-profile-collector-cert\") pod \"catalog-operator-68c6474976-pmrxv\" (UID: \"19fb459f-dca0-464c-9cc0-830b67a34583\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.394305 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8f8201b3-edba-4bac-9d31-08452195ff1f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rnsx7\" (UID: \"8f8201b3-edba-4bac-9d31-08452195ff1f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.394316 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/432cff95-d219-46af-bfc4-c5afbe99c9c0-secret-volume\") pod \"collect-profiles-29324670-fpjm8\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.394491 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/c59afdc0-a7ed-4cc2-8972-7c8d7414375e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gvz2v\" (UID: \"c59afdc0-a7ed-4cc2-8972-7c8d7414375e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.394835 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3bf30b77-2306-4ae3-9ae2-02af916249f2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5wxg4\" (UID: \"3bf30b77-2306-4ae3-9ae2-02af916249f2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.394959 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/94f97c0b-6272-475f-8794-4d9d26318d18-metrics-tls\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.404236 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb9vl\" (UniqueName: \"kubernetes.io/projected/fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3-kube-api-access-sb9vl\") pod \"apiserver-7bbb656c7d-l2dpj\" (UID: \"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: W1003 08:41:37.446525 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode41d9b4e_c3ce_4604_a3f8_1e972308f9a7.slice/crio-c16e50c5e053e3a8bb68771c0c4c03786f8cba92cfb89ec044499f22d5792dfb WatchSource:0}: Error finding container c16e50c5e053e3a8bb68771c0c4c03786f8cba92cfb89ec044499f22d5792dfb: Status 404 returned error can't find the container with id c16e50c5e053e3a8bb68771c0c4c03786f8cba92cfb89ec044499f22d5792dfb Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.447930 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-999jl\" (UniqueName: \"kubernetes.io/projected/6639174a-08b1-409e-a6f1-5e238ef9ae85-kube-api-access-999jl\") pod \"openshift-controller-manager-operator-756b6f6bc6-7fwgr\" (UID: \"6639174a-08b1-409e-a6f1-5e238ef9ae85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.455978 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpdjb\" (UniqueName: \"kubernetes.io/projected/69cc377a-02f9-4e57-a9a8-776b5cef5b9b-kube-api-access-xpdjb\") pod \"olm-operator-6b444d44fb-tmd5f\" (UID: \"69cc377a-02f9-4e57-a9a8-776b5cef5b9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.462559 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.463726 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8dbx\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-kube-api-access-w8dbx\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.472013 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.472598 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:37.972585041 +0000 UTC m=+142.274079371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.472701 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.492223 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.499731 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8a9647c-0b02-4a08-91ee-537052124a65-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-w9ww9\" (UID: \"f8a9647c-0b02-4a08-91ee-537052124a65\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.499858 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc9gw\" (UniqueName: \"kubernetes.io/projected/b2831ebb-3ca5-490d-b0f7-ea2c669f78e3-kube-api-access-lc9gw\") pod \"cluster-image-registry-operator-dc59b4c8b-8fzw7\" (UID: \"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.500113 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.507021 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.518311 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c15a1eca-d125-468b-ac64-8046e4bcd19b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rhthj\" (UID: \"c15a1eca-d125-468b-ac64-8046e4bcd19b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.523793 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.540426 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnvf4\" (UniqueName: \"kubernetes.io/projected/a7ea52a8-bcd9-4234-ba4d-f4181094c260-kube-api-access-qnvf4\") pod \"dns-operator-744455d44c-swd9l\" (UID: \"a7ea52a8-bcd9-4234-ba4d-f4181094c260\") " pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.559320 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftccq\" (UniqueName: \"kubernetes.io/projected/ee85c45f-e702-4221-a738-c57382513f5b-kube-api-access-ftccq\") pod \"machine-config-controller-84d6567774-djlns\" (UID: \"ee85c45f-e702-4221-a738-c57382513f5b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.573487 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.573824 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.073812527 +0000 UTC m=+142.375306857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.578288 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk48h\" (UniqueName: \"kubernetes.io/projected/8bb60ae0-1dd0-4af1-ba69-49f17ed39eba-kube-api-access-mk48h\") pod \"service-ca-9c57cc56f-pfj5p\" (UID: \"8bb60ae0-1dd0-4af1-ba69-49f17ed39eba\") " pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.597880 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mh7d\" (UniqueName: \"kubernetes.io/projected/6ef10623-b762-418f-bc9d-36a66d6ec9fd-kube-api-access-6mh7d\") pod \"console-operator-58897d9998-gzcf9\" (UID: \"6ef10623-b762-418f-bc9d-36a66d6ec9fd\") " pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.640556 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhz42\" (UniqueName: \"kubernetes.io/projected/94f97c0b-6272-475f-8794-4d9d26318d18-kube-api-access-hhz42\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.657816 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j27lj\" (UniqueName: \"kubernetes.io/projected/a58649d2-054a-42ed-848d-beb9e9de3522-kube-api-access-j27lj\") pod \"service-ca-operator-777779d784-n4hxl\" (UID: \"a58649d2-054a-42ed-848d-beb9e9de3522\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.663712 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.673497 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.674059 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.674159 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.174140859 +0000 UTC m=+142.475635189 (durationBeforeRetry 500ms). 
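Each of these failures is parked rather than fatal: nestedpendingoperations.go records "No retries permitted until ... (durationBeforeRetry 500ms)" and the volume reconciler re-issues the MountVolume and UnmountVolume operations after that delay until the driver registers. The sketch below illustrates the same wait-then-retry idea with apimachinery's backoff helper, starting from the 500ms seen in the log; driverRegistered is a hypothetical stand-in for a real probe (for example the CSINode lookup above), not the kubelet's own check.

    package main

    import (
        "fmt"
        "time"

        "k8s.io/apimachinery/pkg/util/wait"
    )

    // driverRegistered is a hypothetical probe; wire it to a real check such as
    // the CSINode lookup sketched earlier.
    func driverRegistered() bool {
        return false
    }

    func main() {
        // Start at 500ms, the initial delay logged above, and back off
        // exponentially over a few attempts instead of failing immediately.
        backoff := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 2.0, Steps: 6}
        err := wait.ExponentialBackoff(backoff, func() (bool, error) {
            if driverRegistered() {
                return true, nil // done; stop retrying
            }
            fmt.Println("kubevirt.io.hostpath-provisioner not registered yet; retrying")
            return false, nil // not done; retry after the next backoff step
        })
        if err != nil {
            fmt.Println("gave up waiting for driver registration:", err)
        }
    }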
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.674421 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.674722 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.174714583 +0000 UTC m=+142.476208903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.684663 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.691338 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.696325 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlf6x\" (UniqueName: \"kubernetes.io/projected/0599e7ee-91e4-4ef3-8b8d-a5aca9e637d3-kube-api-access-rlf6x\") pod \"migrator-59844c95c7-kthpd\" (UID: \"0599e7ee-91e4-4ef3-8b8d-a5aca9e637d3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.701042 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.701900 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npv8q\" (UniqueName: \"kubernetes.io/projected/193eea7c-6015-42df-b104-9a2848192515-kube-api-access-npv8q\") pod \"csi-hostpathplugin-xjnnd\" (UID: \"193eea7c-6015-42df-b104-9a2848192515\") " pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.720376 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2mgk\" (UniqueName: \"kubernetes.io/projected/231de3ee-0e46-4fa8-8380-b31d98d3fab0-kube-api-access-t2mgk\") pod \"machine-config-server-7s2rp\" (UID: \"231de3ee-0e46-4fa8-8380-b31d98d3fab0\") " pod="openshift-machine-config-operator/machine-config-server-7s2rp" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.734380 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f247d\" (UniqueName: \"kubernetes.io/projected/c59afdc0-a7ed-4cc2-8972-7c8d7414375e-kube-api-access-f247d\") pod \"package-server-manager-789f6589d5-gvz2v\" (UID: \"c59afdc0-a7ed-4cc2-8972-7c8d7414375e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.757711 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.758981 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.766771 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.767554 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jgs5w"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.769321 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2j29\" (UniqueName: \"kubernetes.io/projected/432cff95-d219-46af-bfc4-c5afbe99c9c0-kube-api-access-s2j29\") pod \"collect-profiles-29324670-fpjm8\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.776331 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.776451 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.27643386 +0000 UTC m=+142.577928190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.776739 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.777006 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.276998204 +0000 UTC m=+142.578492534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.785927 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.792480 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mmkp\" (UniqueName: \"kubernetes.io/projected/4a3a7817-f128-4b5a-bbb7-604c846009d5-kube-api-access-5mmkp\") pod \"multus-admission-controller-857f4d67dd-rmm5l\" (UID: \"4a3a7817-f128-4b5a-bbb7-604c846009d5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.815987 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85mlv\" (UniqueName: \"kubernetes.io/projected/28cc4e4f-507b-49c7-9a8f-2107e600e834-kube-api-access-85mlv\") pod \"marketplace-operator-79b997595-9g9cw\" (UID: \"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.825584 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.844679 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbpf5\" (UniqueName: \"kubernetes.io/projected/ba0a8c66-3540-44e3-a29f-dda86ace66e8-kube-api-access-qbpf5\") pod \"ingress-canary-qj7dr\" (UID: \"ba0a8c66-3540-44e3-a29f-dda86ace66e8\") " pod="openshift-ingress-canary/ingress-canary-qj7dr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.845053 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.845889 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c2h2\" (UniqueName: \"kubernetes.io/projected/19fb459f-dca0-464c-9cc0-830b67a34583-kube-api-access-5c2h2\") pod \"catalog-operator-68c6474976-pmrxv\" (UID: \"19fb459f-dca0-464c-9cc0-830b67a34583\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.854851 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.864563 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.867309 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxt6j\" (UniqueName: \"kubernetes.io/projected/43701ed5-3c65-480e-b414-9757b707d6be-kube-api-access-pxt6j\") pod \"dns-default-5g7vw\" (UID: \"43701ed5-3c65-480e-b414-9757b707d6be\") " pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.878814 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.878919 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/94f97c0b-6272-475f-8794-4d9d26318d18-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x6hb4\" (UID: \"94f97c0b-6272-475f-8794-4d9d26318d18\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.879272 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.379250905 +0000 UTC m=+142.680745265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.891223 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.899748 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.905095 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.910163 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h28hx\" (UniqueName: \"kubernetes.io/projected/8f8201b3-edba-4bac-9d31-08452195ff1f-kube-api-access-h28hx\") pod \"control-plane-machine-set-operator-78cbb6b69f-rnsx7\" (UID: \"8f8201b3-edba-4bac-9d31-08452195ff1f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.913670 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.924517 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qj7dr" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.933402 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7s2rp" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.941095 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.959349 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.963093 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8mnt6"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.974579 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4"] Oct 03 08:41:37 crc kubenswrapper[4765]: I1003 08:41:37.983484 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:37 crc kubenswrapper[4765]: E1003 08:41:37.984001 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.483954206 +0000 UTC m=+142.785448536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.007231 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.010473 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.059279 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" event={"ID":"a2cdc869-a3e4-410d-be35-0ad4514d8bf8","Type":"ContainerStarted","Data":"0fbb353fdbd281f8d5eaae2015c19a39d2b262922d3a1bb73a0a40c86f74b18a"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.059656 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" event={"ID":"a2cdc869-a3e4-410d-be35-0ad4514d8bf8","Type":"ContainerStarted","Data":"4e6cf10fd10a3fed4de5e0a7c9bea75f83319abedac2df5b8d96acfda452c970"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.067893 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" event={"ID":"8029aef2-a0bf-4d08-b786-0bfff6f8943a","Type":"ContainerStarted","Data":"632c43f7661d8a22e866b5ac36126007c5bdafdf007dbea67eec89c9f7ba89cc"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.086085 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:38 crc kubenswrapper[4765]: E1003 08:41:38.086971 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.586783431 +0000 UTC m=+142.888277771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.095987 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" event={"ID":"3b2c5fda-4f45-444f-991b-0afa96721739","Type":"ContainerStarted","Data":"d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.096080 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" event={"ID":"3b2c5fda-4f45-444f-991b-0afa96721739","Type":"ContainerStarted","Data":"d699286acc7e5a3e670786b6c3628476dea60a398a96fe5b394392f1d115d335"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.096414 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.107916 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.121879 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qdr5x" event={"ID":"d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc","Type":"ContainerStarted","Data":"6c24ee849c13f748aa25a84c6f2ac25d1f3e33f2ff903c16f198f9d45bf164f6"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.121929 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qdr5x" event={"ID":"d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc","Type":"ContainerStarted","Data":"1c2274a34eed6b2d36dc5f2dcf3289342786627d86602db5426e825558c03058"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.124537 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qdr5x" Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.128039 4765 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-4srhb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.128096 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" podUID="3b2c5fda-4f45-444f-991b-0afa96721739" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.128295 4765 patch_prober.go:28] interesting pod/downloads-7954f5f757-qdr5x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Oct 03 08:41:38 crc 
kubenswrapper[4765]: I1003 08:41:38.128367 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qdr5x" podUID="d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.129502 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" event={"ID":"0ea22c01-e088-40b8-aecd-e83fe862bc78","Type":"ContainerStarted","Data":"23e25634a8e1beb840e011822048cce038653c029b8b1d089b39e89d75c8a421"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.136224 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7" Oct 03 08:41:38 crc kubenswrapper[4765]: W1003 08:41:38.161474 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a5c90d2_421e_47fd_a2ae_c7c0c3c5a170.slice/crio-36b89d8fe3206ed351944fcb462ac9afe2de8872281285bca46f7917436f507c WatchSource:0}: Error finding container 36b89d8fe3206ed351944fcb462ac9afe2de8872281285bca46f7917436f507c: Status 404 returned error can't find the container with id 36b89d8fe3206ed351944fcb462ac9afe2de8872281285bca46f7917436f507c Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.171448 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.181085 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.189871 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:38 crc kubenswrapper[4765]: E1003 08:41:38.198353 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.69833432 +0000 UTC m=+142.999828650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.210923 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" event={"ID":"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6","Type":"ContainerStarted","Data":"127988bb7b156ae6b33a954d2937ae45195a7824c14657bcf45530530e68ae51"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.210970 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" event={"ID":"4a2b0c12-72bd-44fb-88f7-18203ba2ccb6","Type":"ContainerStarted","Data":"fcc6bcc96ea3c4c1640779dd2acf6fe757800757bb70421423e713cad428e350"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.213017 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gzcf9"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.240867 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" event={"ID":"69cc377a-02f9-4e57-a9a8-776b5cef5b9b","Type":"ContainerStarted","Data":"e30dd8548e4fe1a7327f916110026be3270f3ef6b766b36d9cefb17a7ed8e9dc"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.244364 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" event={"ID":"eccb70cc-3c95-4f97-ad20-610bb8a7b5df","Type":"ContainerStarted","Data":"a40b7f1aea6983bb6b9894ec7efcb846ba6597e8127da995f22ac2bca4ac9025"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.261157 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" event={"ID":"909e5d8a-0d69-4973-b9ce-bc5febb55e14","Type":"ContainerStarted","Data":"1a31bd6587cdcf9a42190e3c759616d0d5ff734246f404a1525f1cb0eb885690"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.264370 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" event={"ID":"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7","Type":"ContainerStarted","Data":"41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.264409 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" event={"ID":"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7","Type":"ContainerStarted","Data":"c16e50c5e053e3a8bb68771c0c4c03786f8cba92cfb89ec044499f22d5792dfb"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.266167 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g8jbc" event={"ID":"d6e8ca49-1faf-4e22-8760-d7eca3820980","Type":"ContainerStarted","Data":"5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.291748 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" 
event={"ID":"6d5f9563-ba1f-4c05-a32d-127a5c01932d","Type":"ContainerStarted","Data":"9776e33e1af0171dd1dd04de625e952babd4d0ed41dfe9eb5140e83b82eddfd8"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.294728 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:38 crc kubenswrapper[4765]: E1003 08:41:38.297033 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.797010313 +0000 UTC m=+143.098504643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.317736 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" event={"ID":"959cebb7-4057-42d3-a1bf-fc19557247cc","Type":"ContainerStarted","Data":"58ea083fdf1b69fb80b0973310389143704e264106ae39f6deffa9ef69e7b5a1"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.317776 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" event={"ID":"959cebb7-4057-42d3-a1bf-fc19557247cc","Type":"ContainerStarted","Data":"d46422da155def79728bed57a9031732e5ab30d2d685e0179a581329abc54998"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.332789 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f64ph" event={"ID":"1a071347-8c80-4f91-87f3-1d95c7b18a1c","Type":"ContainerStarted","Data":"4615f9b13bce9df01b6d094f9e549d0bac60d0b88bf37eac91ae4b26ae795a4b"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.332841 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f64ph" event={"ID":"1a071347-8c80-4f91-87f3-1d95c7b18a1c","Type":"ContainerStarted","Data":"b0ce93dad55b7d912416d973734fbce3f3229a52e2616b8b9c69c257661febfe"} Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.343770 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pfj5p"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.388972 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.398592 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: 
\"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:38 crc kubenswrapper[4765]: E1003 08:41:38.400268 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.900249178 +0000 UTC m=+143.201743548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.452366 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.478543 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.501021 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:38 crc kubenswrapper[4765]: E1003 08:41:38.501270 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:39.001228527 +0000 UTC m=+143.302722847 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.501704 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:38 crc kubenswrapper[4765]: E1003 08:41:38.502816 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:39.002790525 +0000 UTC m=+143.304284905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.602654 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:38 crc kubenswrapper[4765]: E1003 08:41:38.602921 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:39.102906763 +0000 UTC m=+143.404401093 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4765]: W1003 08:41:38.639618 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb1c8d7c_5da9_41d7_85a7_3a36c632e7b3.slice/crio-e70473fcc71b1d15d7277406000d51eb7f73d7280750392644e7268eef6c8021 WatchSource:0}: Error finding container e70473fcc71b1d15d7277406000d51eb7f73d7280750392644e7268eef6c8021: Status 404 returned error can't find the container with id e70473fcc71b1d15d7277406000d51eb7f73d7280750392644e7268eef6c8021 Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.705067 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:38 crc kubenswrapper[4765]: E1003 08:41:38.706025 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:39.206010315 +0000 UTC m=+143.507504645 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.720005 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.806911 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:38 crc kubenswrapper[4765]: E1003 08:41:38.808004 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:39.307985029 +0000 UTC m=+143.609479359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.909492 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:38 crc kubenswrapper[4765]: E1003 08:41:38.909820 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:39.409808529 +0000 UTC m=+143.711302859 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.933197 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.980546 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-swd9l"] Oct 03 08:41:38 crc kubenswrapper[4765]: I1003 08:41:38.986992 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-djlns"] Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.010736 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:39 crc kubenswrapper[4765]: E1003 08:41:39.011122 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:39.511104466 +0000 UTC m=+143.812598796 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.016410 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.025970 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9cw"] Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.047772 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:39 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:39 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:39 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.047818 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.116311 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:39 crc kubenswrapper[4765]: E1003 08:41:39.116895 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:39.616877063 +0000 UTC m=+143.918371393 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.218266 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:39 crc kubenswrapper[4765]: E1003 08:41:39.218663 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:39.718629111 +0000 UTC m=+144.020123441 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:39 crc kubenswrapper[4765]: W1003 08:41:39.220832 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7ea52a8_bcd9_4234_ba4d_f4181094c260.slice/crio-f908beebb41ffacec9da11d77eded1de1ed78d78616269e6f0adb38fdabb05f3 WatchSource:0}: Error finding container f908beebb41ffacec9da11d77eded1de1ed78d78616269e6f0adb38fdabb05f3: Status 404 returned error can't find the container with id f908beebb41ffacec9da11d77eded1de1ed78d78616269e6f0adb38fdabb05f3 Oct 03 08:41:39 crc kubenswrapper[4765]: W1003 08:41:39.222964 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28cc4e4f_507b_49c7_9a8f_2107e600e834.slice/crio-cf64aba4343e2af32e954b206eacf6fe7a97d7645315c42a21256e478f188b22 WatchSource:0}: Error finding container cf64aba4343e2af32e954b206eacf6fe7a97d7645315c42a21256e478f188b22: Status 404 returned error can't find the container with id cf64aba4343e2af32e954b206eacf6fe7a97d7645315c42a21256e478f188b22 Oct 03 08:41:39 crc kubenswrapper[4765]: W1003 08:41:39.232915 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc59afdc0_a7ed_4cc2_8972_7c8d7414375e.slice/crio-d397f3a6230668bd35d422fad052032082e7fd91481f93b0794239af0d9355f5 WatchSource:0}: Error finding container d397f3a6230668bd35d422fad052032082e7fd91481f93b0794239af0d9355f5: Status 404 returned error can't find the container with id d397f3a6230668bd35d422fad052032082e7fd91481f93b0794239af0d9355f5 Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.322343 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:39 crc kubenswrapper[4765]: E1003 08:41:39.323991 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:39.823976588 +0000 UTC m=+144.125470918 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.404857 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" event={"ID":"a7ea52a8-bcd9-4234-ba4d-f4181094c260","Type":"ContainerStarted","Data":"f908beebb41ffacec9da11d77eded1de1ed78d78616269e6f0adb38fdabb05f3"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.405595 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qdr5x" podStartSLOduration=123.40558159 podStartE2EDuration="2m3.40558159s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.403076538 +0000 UTC m=+143.704570878" watchObservedRunningTime="2025-10-03 08:41:39.40558159 +0000 UTC m=+143.707075920" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.423093 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" event={"ID":"c15a1eca-d125-468b-ac64-8046e4bcd19b","Type":"ContainerStarted","Data":"d400196098648b459b072cce784a7240c5c4c54af36a0f43a8692adb34420ba0"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.429958 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:39 crc kubenswrapper[4765]: E1003 08:41:39.430243 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:39.930227867 +0000 UTC m=+144.231722197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.464423 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl"] Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.486703 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" event={"ID":"3bf30b77-2306-4ae3-9ae2-02af916249f2","Type":"ContainerStarted","Data":"3a1825eace7957d1b4b032b4f794145b4bb0ddc8a6cc0ae330868d93945a7895"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.506413 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" event={"ID":"8bb60ae0-1dd0-4af1-ba69-49f17ed39eba","Type":"ContainerStarted","Data":"1c69a9dbb54eba79611d4a7b06f6bb7505d9e9df1f0dcc0373b6f2f890889b7d"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.517096 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7s2rp" event={"ID":"231de3ee-0e46-4fa8-8380-b31d98d3fab0","Type":"ContainerStarted","Data":"71b2becb82ef590996ba801abdb8a73a0aaffea6f928489d2e3f0b9e165f6042"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.531399 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:39 crc kubenswrapper[4765]: E1003 08:41:39.531821 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:40.031802051 +0000 UTC m=+144.333296381 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.555171 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-g8jbc" podStartSLOduration=124.555146247 podStartE2EDuration="2m4.555146247s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.551566778 +0000 UTC m=+143.853061108" watchObservedRunningTime="2025-10-03 08:41:39.555146247 +0000 UTC m=+143.856640567" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.563592 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" event={"ID":"8029aef2-a0bf-4d08-b786-0bfff6f8943a","Type":"ContainerStarted","Data":"c4468a5aca6bc5279e96172af4928b01f277459fef8dc60f4fc307c2bece3706"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.565396 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.572266 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" event={"ID":"ee85c45f-e702-4221-a738-c57382513f5b","Type":"ContainerStarted","Data":"5e1135d8f4fd35c02eb86dadd114d95136a81a28f2af5f652ac524680893e49e"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.579574 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" podStartSLOduration=124.579556418 podStartE2EDuration="2m4.579556418s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.579063096 +0000 UTC m=+143.880557426" watchObservedRunningTime="2025-10-03 08:41:39.579556418 +0000 UTC m=+143.881050738" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.579917 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" event={"ID":"6d421bb9-ba2e-416a-9554-7c4c7c93658b","Type":"ContainerStarted","Data":"20723bd10342160a370d97eb38b58de9dd64a702ed08e9e9c575ef7316150738"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.622928 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" event={"ID":"f8a9647c-0b02-4a08-91ee-537052124a65","Type":"ContainerStarted","Data":"5ee67bf158be1cb3e20dcf1ebaf33e810af7ddb7470d9b878bda66f6383a0cc2"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.622972 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" 
event={"ID":"f8a9647c-0b02-4a08-91ee-537052124a65","Type":"ContainerStarted","Data":"f011f68aab6ea4494da2c9563c5e0a3c759a232864272f332dc7c61ff55c40a7"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.632006 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-4rknw" podStartSLOduration=123.631986691 podStartE2EDuration="2m3.631986691s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.631613752 +0000 UTC m=+143.933108092" watchObservedRunningTime="2025-10-03 08:41:39.631986691 +0000 UTC m=+143.933481011" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.632297 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:39 crc kubenswrapper[4765]: E1003 08:41:39.632688 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:40.132670508 +0000 UTC m=+144.434164828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.658934 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" event={"ID":"69cc377a-02f9-4e57-a9a8-776b5cef5b9b","Type":"ContainerStarted","Data":"6a7a7bf43c0215b29d9f11e0710567fae04f7fcf3d84b0a84a09a16db49153fd"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.662099 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.692070 4765 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-tmd5f container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.692373 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" podUID="69cc377a-02f9-4e57-a9a8-776b5cef5b9b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.707659 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gzcf9" 
event={"ID":"6ef10623-b762-418f-bc9d-36a66d6ec9fd","Type":"ContainerStarted","Data":"1b47c5bf6c99eaf297e0946b978000da7859b50363b4b25c8e5c4d80834c1fc3"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.709448 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-f64ph" podStartSLOduration=123.70943483 podStartE2EDuration="2m3.70943483s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.708004545 +0000 UTC m=+144.009498875" watchObservedRunningTime="2025-10-03 08:41:39.70943483 +0000 UTC m=+144.010929160" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.713747 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vpkxq" podStartSLOduration=124.713736176 podStartE2EDuration="2m4.713736176s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.660927484 +0000 UTC m=+143.962421814" watchObservedRunningTime="2025-10-03 08:41:39.713736176 +0000 UTC m=+144.015230506" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.724241 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" event={"ID":"17c891c2-c5ff-4815-9f09-347204c5da1d","Type":"ContainerStarted","Data":"6140424bd9297c2aa82e61a3ece0117456c33cb5ca4b2ea9f8b33656dccee8fd"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.724371 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" event={"ID":"17c891c2-c5ff-4815-9f09-347204c5da1d","Type":"ContainerStarted","Data":"56502006a0b7e17f48d9c0c0efc234c45b146caa8c22aa7fb903a37b3163f51f"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.734184 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:39 crc kubenswrapper[4765]: E1003 08:41:39.734577 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:40.234562259 +0000 UTC m=+144.536056599 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.776064 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv"] Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.820014 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qj7dr"] Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.820068 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-t6dcd" event={"ID":"eccb70cc-3c95-4f97-ad20-610bb8a7b5df","Type":"ContainerStarted","Data":"2d6439ab7b76a2d455d8936331b68ca426a45b35f8850650ba3f1c58efeccfa2"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.835447 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.835781 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" podStartSLOduration=124.835747924 podStartE2EDuration="2m4.835747924s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.764355244 +0000 UTC m=+144.065849584" watchObservedRunningTime="2025-10-03 08:41:39.835747924 +0000 UTC m=+144.137242244" Oct 03 08:41:39 crc kubenswrapper[4765]: E1003 08:41:39.838298 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:40.338267886 +0000 UTC m=+144.639762216 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.843279 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.849322 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" podStartSLOduration=123.849292838 podStartE2EDuration="2m3.849292838s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.788263413 +0000 UTC m=+144.089757763" watchObservedRunningTime="2025-10-03 08:41:39.849292838 +0000 UTC m=+144.150787168" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.878771 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" podStartSLOduration=123.878747264 podStartE2EDuration="2m3.878747264s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.848896158 +0000 UTC m=+144.150390498" watchObservedRunningTime="2025-10-03 08:41:39.878747264 +0000 UTC m=+144.180241594" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.878839 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xjnnd"] Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.879712 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mpxvz" event={"ID":"909e5d8a-0d69-4973-b9ce-bc5febb55e14","Type":"ContainerStarted","Data":"addb9ce4ffac106372d2d1cae93fdd20ad2ba60896afaeeacf68f14a5c0c1a73"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.881922 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8"] Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.886262 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" event={"ID":"6639174a-08b1-409e-a6f1-5e238ef9ae85","Type":"ContainerStarted","Data":"ede00e09e771771b4af024f2f3eb678f3b24e0b62fef7601cdf30845926269c9"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.897136 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ph5q8" podStartSLOduration=123.897099606 podStartE2EDuration="2m3.897099606s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.891626821 +0000 UTC m=+144.193121151" watchObservedRunningTime="2025-10-03 
08:41:39.897099606 +0000 UTC m=+144.198593936" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.897965 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" event={"ID":"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3","Type":"ContainerStarted","Data":"fcb0932a256e16787da07410fe2cdab055d59b14e2571cc3845718deb076de17"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.922671 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" event={"ID":"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170","Type":"ContainerStarted","Data":"36b89d8fe3206ed351944fcb462ac9afe2de8872281285bca46f7917436f507c"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.923570 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" event={"ID":"0cc05495-73d0-4866-adcb-aa89431470c5","Type":"ContainerStarted","Data":"8c5a14cfb61e621a3fd3dee98238c4b522b2aca77038d6f8403471511d57ffd7"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.924052 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ts98r" podStartSLOduration=123.92402569 podStartE2EDuration="2m3.92402569s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.921344724 +0000 UTC m=+144.222839054" watchObservedRunningTime="2025-10-03 08:41:39.92402569 +0000 UTC m=+144.225520020" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.924196 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" event={"ID":"28cc4e4f-507b-49c7-9a8f-2107e600e834","Type":"ContainerStarted","Data":"cf64aba4343e2af32e954b206eacf6fe7a97d7645315c42a21256e478f188b22"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.924807 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" event={"ID":"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3","Type":"ContainerStarted","Data":"e70473fcc71b1d15d7277406000d51eb7f73d7280750392644e7268eef6c8021"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.925956 4765 generic.go:334] "Generic (PLEG): container finished" podID="6d5f9563-ba1f-4c05-a32d-127a5c01932d" containerID="c36dd99ce20ff47655a088d1f5ee6a46dc5d1fdd05e98d71e939f102044e8612" exitCode=0 Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.926004 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" event={"ID":"6d5f9563-ba1f-4c05-a32d-127a5c01932d","Type":"ContainerDied","Data":"c36dd99ce20ff47655a088d1f5ee6a46dc5d1fdd05e98d71e939f102044e8612"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.927333 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" event={"ID":"c59afdc0-a7ed-4cc2-8972-7c8d7414375e","Type":"ContainerStarted","Data":"d397f3a6230668bd35d422fad052032082e7fd91481f93b0794239af0d9355f5"} Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.930667 4765 patch_prober.go:28] interesting pod/downloads-7954f5f757-qdr5x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 
10.217.0.7:8080: connect: connection refused" start-of-body= Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.930714 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qdr5x" podUID="d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.931731 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.944013 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:39 crc kubenswrapper[4765]: E1003 08:41:39.947545 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:40.447499078 +0000 UTC m=+144.748993578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.948116 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.955255 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.983391 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" podStartSLOduration=123.983375863 podStartE2EDuration="2m3.983375863s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:39.982710376 +0000 UTC m=+144.284204706" watchObservedRunningTime="2025-10-03 08:41:39.983375863 +0000 UTC m=+144.284870183" Oct 03 08:41:39 crc kubenswrapper[4765]: I1003 08:41:39.993466 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rmm5l"] Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.028271 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:40 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:40 crc kubenswrapper[4765]: 
[+]process-running ok Oct 03 08:41:40 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.028579 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.048694 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.050287 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-w9ww9" podStartSLOduration=124.050269122 podStartE2EDuration="2m4.050269122s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:40.008986554 +0000 UTC m=+144.310480884" watchObservedRunningTime="2025-10-03 08:41:40.050269122 +0000 UTC m=+144.351763452" Oct 03 08:41:40 crc kubenswrapper[4765]: E1003 08:41:40.051692 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:40.551631865 +0000 UTC m=+144.853126205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.075966 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5g7vw"] Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.076071 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd"] Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.098232 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7"] Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.130619 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4"] Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.152639 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:40 crc kubenswrapper[4765]: E1003 08:41:40.152988 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:40.652976084 +0000 UTC m=+144.954470414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:40 crc kubenswrapper[4765]: W1003 08:41:40.153292 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f8201b3_edba_4bac_9d31_08452195ff1f.slice/crio-53a4761db099fe49409037075310b3f5c3c01997a45fdc155ab64d5f7bede3d6 WatchSource:0}: Error finding container 53a4761db099fe49409037075310b3f5c3c01997a45fdc155ab64d5f7bede3d6: Status 404 returned error can't find the container with id 53a4761db099fe49409037075310b3f5c3c01997a45fdc155ab64d5f7bede3d6 Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.253960 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:40 crc kubenswrapper[4765]: E1003 08:41:40.254838 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:40.754817874 +0000 UTC m=+145.056312214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.356773 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:40 crc kubenswrapper[4765]: E1003 08:41:40.357313 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:40.85729372 +0000 UTC m=+145.158788050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.458867 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:40 crc kubenswrapper[4765]: E1003 08:41:40.459500 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:40.959472559 +0000 UTC m=+145.260966889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.568025 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:40 crc kubenswrapper[4765]: E1003 08:41:40.569371 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:41.069355778 +0000 UTC m=+145.370850108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.674483 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:40 crc kubenswrapper[4765]: E1003 08:41:40.674963 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:41.17490669 +0000 UTC m=+145.476401030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.775981 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:40 crc kubenswrapper[4765]: E1003 08:41:40.776823 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:41.276812122 +0000 UTC m=+145.578306452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.877325 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:40 crc kubenswrapper[4765]: E1003 08:41:40.877832 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:41.377794561 +0000 UTC m=+145.679288891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.976881 4765 generic.go:334] "Generic (PLEG): container finished" podID="4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170" containerID="861e3dfd9ea401bc0f7c25e4516eca2db07328eff7d1f670bf4178d71b290ae9" exitCode=0 Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.977877 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" event={"ID":"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170","Type":"ContainerDied","Data":"861e3dfd9ea401bc0f7c25e4516eca2db07328eff7d1f670bf4178d71b290ae9"} Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.979038 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:40 crc kubenswrapper[4765]: E1003 08:41:40.979341 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:41.479330054 +0000 UTC m=+145.780824384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.983914 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" event={"ID":"19fb459f-dca0-464c-9cc0-830b67a34583","Type":"ContainerStarted","Data":"d61b51f505e550487e33e66e7b30109e726133e9d7a09631cb2722f6fda2d0dd"} Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.985279 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" event={"ID":"193eea7c-6015-42df-b104-9a2848192515","Type":"ContainerStarted","Data":"347a8037477034fbb139c75e9200e78789b38bdf84bee2600311e98ac87f8a2e"} Oct 03 08:41:40 crc kubenswrapper[4765]: I1003 08:41:40.995081 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" event={"ID":"0cc05495-73d0-4866-adcb-aa89431470c5","Type":"ContainerStarted","Data":"95ff895613e679c431c937faaf86f4c3f4d8d76803550f8ede4476f14300aeb8"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.010103 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7s2rp" event={"ID":"231de3ee-0e46-4fa8-8380-b31d98d3fab0","Type":"ContainerStarted","Data":"2059e0f4ea277d38b115db693fe8a5871b58cfc808a29a79d6514d0d44a485fe"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.013917 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd" event={"ID":"0599e7ee-91e4-4ef3-8b8d-a5aca9e637d3","Type":"ContainerStarted","Data":"9a0da89ccf0c0e584c875978d48ae4b59309b04065f997f1ffb11cbdd931c928"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.023608 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:41 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:41 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:41 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.023867 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.028928 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" event={"ID":"8bb60ae0-1dd0-4af1-ba69-49f17ed39eba","Type":"ContainerStarted","Data":"46ba64a7dc32de351264e15a3841f5e4178a0c0720f62bb7eb8bdbd0034da042"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.042130 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gzcf9" 
event={"ID":"6ef10623-b762-418f-bc9d-36a66d6ec9fd","Type":"ContainerStarted","Data":"022ebecdd897362e2e8b89fc155d86772084ed088fa11251069d9dda8069aa23"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.044477 4765 patch_prober.go:28] interesting pod/console-operator-58897d9998-gzcf9 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.044518 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gzcf9" podUID="6ef10623-b762-418f-bc9d-36a66d6ec9fd" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.044580 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.053198 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7" event={"ID":"8f8201b3-edba-4bac-9d31-08452195ff1f","Type":"ContainerStarted","Data":"53a4761db099fe49409037075310b3f5c3c01997a45fdc155ab64d5f7bede3d6"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.081841 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.082343 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:41.582303742 +0000 UTC m=+145.883798072 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.082357 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" event={"ID":"6d5f9563-ba1f-4c05-a32d-127a5c01932d","Type":"ContainerStarted","Data":"0c1c00811b14522248a6600e35c18e2809c0e61215726101717c886fc64e7746"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.095004 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.099615 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5g7vw" event={"ID":"43701ed5-3c65-480e-b414-9757b707d6be","Type":"ContainerStarted","Data":"ca7d78272ca62fcc60cc501085a762a2635447ddd1dcf30517e58ae632fbfbf0"} Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.105212 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:41.605194646 +0000 UTC m=+145.906688976 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.107909 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7s2rp" podStartSLOduration=7.107891693 podStartE2EDuration="7.107891693s" podCreationTimestamp="2025-10-03 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:41.096279557 +0000 UTC m=+145.397773887" watchObservedRunningTime="2025-10-03 08:41:41.107891693 +0000 UTC m=+145.409386023" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.138611 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-gzcf9" podStartSLOduration=126.13859718 podStartE2EDuration="2m6.13859718s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:41.133081094 +0000 UTC m=+145.434575424" watchObservedRunningTime="2025-10-03 08:41:41.13859718 +0000 UTC m=+145.440091500" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.180166 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-pfj5p" podStartSLOduration=125.180147094 podStartE2EDuration="2m5.180147094s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:41.177619322 +0000 UTC m=+145.479113652" watchObservedRunningTime="2025-10-03 08:41:41.180147094 +0000 UTC m=+145.481641424" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.198654 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.199117 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:41.699098141 +0000 UTC m=+146.000592471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.233115 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" event={"ID":"6639174a-08b1-409e-a6f1-5e238ef9ae85","Type":"ContainerStarted","Data":"484fab93142cde2513eeee867f2abdc36b84edec403bfab4b532c9f6f1d5d73a"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.249112 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qj7dr" event={"ID":"ba0a8c66-3540-44e3-a29f-dda86ace66e8","Type":"ContainerStarted","Data":"736a74286859c1af9328ce2b54589dc023eeab947098abce5fd9fa5a09a7591a"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.249183 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qj7dr" event={"ID":"ba0a8c66-3540-44e3-a29f-dda86ace66e8","Type":"ContainerStarted","Data":"72047dcf329cdb12229f2731278191cad3d2599808246d744e9c6377f5e04c46"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.263853 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7fwgr" podStartSLOduration=125.263837826 podStartE2EDuration="2m5.263837826s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:41.262975465 +0000 UTC m=+145.564469795" watchObservedRunningTime="2025-10-03 08:41:41.263837826 +0000 UTC m=+145.565332156" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.282220 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" event={"ID":"4a3a7817-f128-4b5a-bbb7-604c846009d5","Type":"ContainerStarted","Data":"581c78b85ebf976d6cecb891d0a94dd0145d8a54398909fd22d4789edfc005d9"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.299784 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.301628 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:41.801615767 +0000 UTC m=+146.103110097 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.328625 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" event={"ID":"94f97c0b-6272-475f-8794-4d9d26318d18","Type":"ContainerStarted","Data":"39bf46ed3d279bb67e0d98d2fd674a8f490b8b69c39f5ba7eb84ce30e0fa5d72"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.330472 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" event={"ID":"a58649d2-054a-42ed-848d-beb9e9de3522","Type":"ContainerStarted","Data":"6bba565a238a98e4ddbddc178fe4f406c184aa948cf6b802e3dd9bd7448c762a"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.336521 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" event={"ID":"432cff95-d219-46af-bfc4-c5afbe99c9c0","Type":"ContainerStarted","Data":"84b5e30a2661036d7c3e2520ca56192ba3ced9248fa0b33efe611b28b75cc791"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.356599 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" event={"ID":"6d421bb9-ba2e-416a-9554-7c4c7c93658b","Type":"ContainerStarted","Data":"e7ea417b7271b485907cd202c4f4e086b40e4851e2c7d21763fc28fdcdbe407a"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.385055 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" podStartSLOduration=125.385040194 podStartE2EDuration="2m5.385040194s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:41.383319611 +0000 UTC m=+145.684813951" watchObservedRunningTime="2025-10-03 08:41:41.385040194 +0000 UTC m=+145.686534524" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.385881 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qj7dr" podStartSLOduration=7.3858735939999995 podStartE2EDuration="7.385873594s" podCreationTimestamp="2025-10-03 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:41.299129616 +0000 UTC m=+145.600623946" watchObservedRunningTime="2025-10-03 08:41:41.385873594 +0000 UTC m=+145.687367924" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.393984 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" event={"ID":"3bf30b77-2306-4ae3-9ae2-02af916249f2","Type":"ContainerStarted","Data":"8b1115b8d44e4934a6f40cdec5152df2be56a5f79c272c9dc99b7784a27b7991"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.401197 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.401321 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:41.901300295 +0000 UTC m=+146.202794625 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.401453 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.402587 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:41.902572176 +0000 UTC m=+146.204066516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.456746 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" podStartSLOduration=125.45670083 podStartE2EDuration="2m5.45670083s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:41.428856144 +0000 UTC m=+145.730350474" watchObservedRunningTime="2025-10-03 08:41:41.45670083 +0000 UTC m=+145.758195160" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.461194 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5wxg4" podStartSLOduration=125.46116542 podStartE2EDuration="2m5.46116542s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:41.460274998 +0000 UTC m=+145.761769328" watchObservedRunningTime="2025-10-03 08:41:41.46116542 +0000 UTC m=+145.762659750" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.495625 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" event={"ID":"b2831ebb-3ca5-490d-b0f7-ea2c669f78e3","Type":"ContainerStarted","Data":"55d7c6ed3c8033addb8b2f2dd7f54259d7232082cc7e1d851068243ad25f2a9c"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.502740 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.504207 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.004184651 +0000 UTC m=+146.305678981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.510838 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" event={"ID":"0ea22c01-e088-40b8-aecd-e83fe862bc78","Type":"ContainerStarted","Data":"ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd"} Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.510885 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.535245 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-tmd5f" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.541185 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8fzw7" podStartSLOduration=125.541157652 podStartE2EDuration="2m5.541157652s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:41.539193704 +0000 UTC m=+145.840688034" watchObservedRunningTime="2025-10-03 08:41:41.541157652 +0000 UTC m=+145.842651982" Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.606593 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.608394 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.108380859 +0000 UTC m=+146.409875189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.708625 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.709399 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.209360388 +0000 UTC m=+146.510854718 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.710136 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.712662 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.212621719 +0000 UTC m=+146.514116049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.811189 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.812622 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.312594563 +0000 UTC m=+146.614088893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.815889 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.816366 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.316352886 +0000 UTC m=+146.617847216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:41 crc kubenswrapper[4765]: I1003 08:41:41.916789 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:41 crc kubenswrapper[4765]: E1003 08:41:41.917163 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.417144741 +0000 UTC m=+146.718639071 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.021049 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:42 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:42 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:42 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.021336 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.021372 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.021664 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.521632136 +0000 UTC m=+146.823126466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.122714 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.122906 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.622864092 +0000 UTC m=+146.924358422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.123011 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.123302 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.623290422 +0000 UTC m=+146.924784752 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.223776 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.224136 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.724121668 +0000 UTC m=+147.025615998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.325171 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.325692 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.825666401 +0000 UTC m=+147.127160731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.426461 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.426630 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.926606839 +0000 UTC m=+147.228101159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.426834 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.427251 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:42.927232875 +0000 UTC m=+147.228727205 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.515025 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7" event={"ID":"8f8201b3-edba-4bac-9d31-08452195ff1f","Type":"ContainerStarted","Data":"f65b8b31244c399e7310239426eb5266fa3be6fe5087202a37077c717f5d24f7"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.517073 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" event={"ID":"94f97c0b-6272-475f-8794-4d9d26318d18","Type":"ContainerStarted","Data":"4fc0fce7fa272f101793ecfdda068ffe6bc96dd3a479eb64fe24f459daaf3fd2"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.517112 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" event={"ID":"94f97c0b-6272-475f-8794-4d9d26318d18","Type":"ContainerStarted","Data":"a783c9e62ccc7bc9fd8ebf9c46424a21c20ddd5dbdcebcb699a78541aaa2df1f"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.517144 4765 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jgs5w container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.6:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.517191 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" podUID="0ea22c01-e088-40b8-aecd-e83fe862bc78" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.6:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.519171 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" event={"ID":"c59afdc0-a7ed-4cc2-8972-7c8d7414375e","Type":"ContainerStarted","Data":"484643013a76a7d88f9d59f7189e84949408c4ed99b76ef259279c7eb1da4303"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.519214 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" event={"ID":"c59afdc0-a7ed-4cc2-8972-7c8d7414375e","Type":"ContainerStarted","Data":"4a4033683302bff29392ab86f07bc56e402998cf898a64fb35d991ca19796fe0"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.519319 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.521705 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" event={"ID":"6d5f9563-ba1f-4c05-a32d-127a5c01932d","Type":"ContainerStarted","Data":"925ed0dbe6229ada2943e44e9d4ac884a1a956e018bdf2c753b0612906db92aa"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 
08:41:42.523902 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" event={"ID":"28cc4e4f-507b-49c7-9a8f-2107e600e834","Type":"ContainerStarted","Data":"9fc01a9d3ec9443dd862e65771dc9f3be592acce41d965949734c07886215d4a"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.524041 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.526253 4765 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9g9cw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.526305 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" podUID="28cc4e4f-507b-49c7-9a8f-2107e600e834" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.526891 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd" event={"ID":"0599e7ee-91e4-4ef3-8b8d-a5aca9e637d3","Type":"ContainerStarted","Data":"ffab0d503c01e6cc978adcebbe3d06e47b089dad582bf5ba622e407da1e43651"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.526932 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd" event={"ID":"0599e7ee-91e4-4ef3-8b8d-a5aca9e637d3","Type":"ContainerStarted","Data":"c1ec3c2540ebdb110111cc539a485fb133a89a92bffc27e3b013da99b651094f"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.527463 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.527922 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.027887156 +0000 UTC m=+147.329381486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.528779 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" event={"ID":"193eea7c-6015-42df-b104-9a2848192515","Type":"ContainerStarted","Data":"af1bfee93cb32705245b22e714f321e7449b2847f082dc3a427208244a7bf942"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.530398 4765 generic.go:334] "Generic (PLEG): container finished" podID="fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3" containerID="228c5898a01d7ace79d90d52f66e206a098c30e37bd1857cc39f62a6202fb39d" exitCode=0 Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.530467 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" event={"ID":"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3","Type":"ContainerDied","Data":"228c5898a01d7ace79d90d52f66e206a098c30e37bd1857cc39f62a6202fb39d"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.532857 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-n4hxl" event={"ID":"a58649d2-054a-42ed-848d-beb9e9de3522","Type":"ContainerStarted","Data":"8c6149c8b0b5e879050ce064a4a69c1ee80c0de27f026a209d4143da904797f6"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.538587 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" event={"ID":"19fb459f-dca0-464c-9cc0-830b67a34583","Type":"ContainerStarted","Data":"d3fe2547415cb5666a1dc22ff0a7d7dadff5d115cd1e02d611fdded10f91cf6b"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.542344 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" event={"ID":"a7ea52a8-bcd9-4234-ba4d-f4181094c260","Type":"ContainerStarted","Data":"44416fe2be3cc65513e3c6685b012b1436191baec1e057e00f6a1bf2e3f12dd1"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.542390 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.542405 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" event={"ID":"a7ea52a8-bcd9-4234-ba4d-f4181094c260","Type":"ContainerStarted","Data":"8a23fafe330cdf2a0b8bbc5bb5fcfc43084641be75652c3e1a30d67c9d85539c"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.549504 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" event={"ID":"ee85c45f-e702-4221-a738-c57382513f5b","Type":"ContainerStarted","Data":"0e1ef7f0d64ea736726ed3baf8432440534cd94f41934591d7ee7b39e1aabd76"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.550361 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" 
event={"ID":"ee85c45f-e702-4221-a738-c57382513f5b","Type":"ContainerStarted","Data":"1ee4e0fa6a4b8d8f79ad4084b1d847b868cd1044281d0aa14c5f7d95fe3bf6fc"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.555597 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" event={"ID":"17c891c2-c5ff-4815-9f09-347204c5da1d","Type":"ContainerStarted","Data":"d39d9d64ab6c3cf0f7d7bd792937e2b2fe0e2618f46a88c6d2d0e6835f201132"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.558083 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.566751 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" event={"ID":"c15a1eca-d125-468b-ac64-8046e4bcd19b","Type":"ContainerStarted","Data":"fa04cde2a42d60657139dfe64220c74f4e7deb0e18eb462298203cf3397dfb50"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.574160 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" podStartSLOduration=127.574141946 podStartE2EDuration="2m7.574141946s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:41.62586992 +0000 UTC m=+145.927364270" watchObservedRunningTime="2025-10-03 08:41:42.574141946 +0000 UTC m=+146.875636276" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.583513 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" event={"ID":"0cc05495-73d0-4866-adcb-aa89431470c5","Type":"ContainerStarted","Data":"db8f42cfac3ac68a35c02ec5b4a7967a2a780eb702abbab870f275d0e246dc8e"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.598089 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5g7vw" event={"ID":"43701ed5-3c65-480e-b414-9757b707d6be","Type":"ContainerStarted","Data":"9af8b50cbe8839aacf57bb5bb2692dbfb1555bcda3b48964359f61f7f42153bd"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.598130 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5g7vw" event={"ID":"43701ed5-3c65-480e-b414-9757b707d6be","Type":"ContainerStarted","Data":"06c5a34a5911781d14482717175dcfd56f742c342f1d8ca7d2a41276c1363873"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.599886 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.618181 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-74jfz" event={"ID":"6d421bb9-ba2e-416a-9554-7c4c7c93658b","Type":"ContainerStarted","Data":"75cf8667249f72d5ac90278b26feade1e25f9603989836ec390f27fa2aca483e"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.633952 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.643388 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" event={"ID":"4a3a7817-f128-4b5a-bbb7-604c846009d5","Type":"ContainerStarted","Data":"2034545cd8d95c1ac1b61a725f70f4aa83af8caa9f7530dcadf3000fa47c4487"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.643430 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" event={"ID":"4a3a7817-f128-4b5a-bbb7-604c846009d5","Type":"ContainerStarted","Data":"086cf24ee41d3ba93825706ad5d6286bbf88b14e1e9f8aaa7c62c03641778875"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.647116 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rnsx7" podStartSLOduration=126.647100244 podStartE2EDuration="2m6.647100244s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:42.57918276 +0000 UTC m=+146.880677090" watchObservedRunningTime="2025-10-03 08:41:42.647100244 +0000 UTC m=+146.948594574" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.647196 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pmrxv" podStartSLOduration=126.647192857 podStartE2EDuration="2m6.647192857s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:42.646178402 +0000 UTC m=+146.947672732" watchObservedRunningTime="2025-10-03 08:41:42.647192857 +0000 UTC m=+146.948687187" Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.647540 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.147525875 +0000 UTC m=+147.449020195 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.680217 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" event={"ID":"4a5c90d2-421e-47fd-a2ae-c7c0c3c5a170","Type":"ContainerStarted","Data":"9f7ee9a62135d4b434d927c5c0e6477d0b10175c5a33a9c72c7aa54a81b90fc0"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.680954 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.702334 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" event={"ID":"432cff95-d219-46af-bfc4-c5afbe99c9c0","Type":"ContainerStarted","Data":"69df2fe02aca2ccab6a7c3ba623df1fe4c5bd0889c95dcaab6a47e692ab00860"} Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.718345 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.735435 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.736791 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.236769155 +0000 UTC m=+147.538263485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.789983 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" podStartSLOduration=126.789964166 podStartE2EDuration="2m6.789964166s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:42.685759007 +0000 UTC m=+146.987253347" watchObservedRunningTime="2025-10-03 08:41:42.789964166 +0000 UTC m=+147.091458496" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.791492 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x6hb4" podStartSLOduration=126.791464763 podStartE2EDuration="2m6.791464763s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:42.790309195 +0000 UTC m=+147.091803555" watchObservedRunningTime="2025-10-03 08:41:42.791464763 +0000 UTC m=+147.092959093" Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.839341 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.839684 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.339672221 +0000 UTC m=+147.641166551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.940873 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:42 crc kubenswrapper[4765]: E1003 08:41:42.941289 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.441271596 +0000 UTC m=+147.742765926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:42 crc kubenswrapper[4765]: I1003 08:41:42.963912 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-8mnt6" podStartSLOduration=126.963896474 podStartE2EDuration="2m6.963896474s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:42.962967941 +0000 UTC m=+147.264462271" watchObservedRunningTime="2025-10-03 08:41:42.963896474 +0000 UTC m=+147.265390794" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.029683 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:43 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:43 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:43 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.030050 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.042061 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.042456 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.5424424 +0000 UTC m=+147.843936730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.078988 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-swd9l" podStartSLOduration=127.07895432 podStartE2EDuration="2m7.07895432s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.028492006 +0000 UTC m=+147.329986336" watchObservedRunningTime="2025-10-03 08:41:43.07895432 +0000 UTC m=+147.380448650" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.108229 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gzcf9" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.124584 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-djlns" podStartSLOduration=127.124564514 podStartE2EDuration="2m7.124564514s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.121260953 +0000 UTC m=+147.422755273" watchObservedRunningTime="2025-10-03 08:41:43.124564514 +0000 UTC m=+147.426058844" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.126128 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" podStartSLOduration=127.126121803 podStartE2EDuration="2m7.126121803s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.080078028 +0000 UTC m=+147.381572368" watchObservedRunningTime="2025-10-03 08:41:43.126121803 +0000 UTC m=+147.427616133" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.143717 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.143882 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 08:41:43.64385868 +0000 UTC m=+147.945353010 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.144034 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.144326 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.644318281 +0000 UTC m=+147.945812611 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.204662 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" podStartSLOduration=128.204631858 podStartE2EDuration="2m8.204631858s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.175200682 +0000 UTC m=+147.476695022" watchObservedRunningTime="2025-10-03 08:41:43.204631858 +0000 UTC m=+147.506126188" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.204924 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kthpd" podStartSLOduration=127.204920915 podStartE2EDuration="2m7.204920915s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.203141901 +0000 UTC m=+147.504636241" watchObservedRunningTime="2025-10-03 08:41:43.204920915 +0000 UTC m=+147.506415235" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.245571 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.246105 4765 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.74608557 +0000 UTC m=+148.047579910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.247589 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rhthj" podStartSLOduration=127.247571586 podStartE2EDuration="2m7.247571586s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.245857424 +0000 UTC m=+147.547351754" watchObservedRunningTime="2025-10-03 08:41:43.247571586 +0000 UTC m=+147.549065906" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.350405 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.350450 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.350474 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.350498 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.350524 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" 
Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.350869 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.850855912 +0000 UTC m=+148.152350242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.352351 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.376636 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.379450 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" podStartSLOduration=128.379433807 podStartE2EDuration="2m8.379433807s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.3791616 +0000 UTC m=+147.680655940" watchObservedRunningTime="2025-10-03 08:41:43.379433807 +0000 UTC m=+147.680928137" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.380083 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vphkx" podStartSLOduration=128.380077283 podStartE2EDuration="2m8.380077283s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.350031002 +0000 UTC m=+147.651525352" watchObservedRunningTime="2025-10-03 08:41:43.380077283 +0000 UTC m=+147.681571613" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.380529 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.382266 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.452105 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.452267 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.952237472 +0000 UTC m=+148.253731812 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.453079 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.453562 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:43.953547754 +0000 UTC m=+148.255042094 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.498991 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" podStartSLOduration=128.498974884 podStartE2EDuration="2m8.498974884s" podCreationTimestamp="2025-10-03 08:39:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.438010451 +0000 UTC m=+147.739504801" watchObservedRunningTime="2025-10-03 08:41:43.498974884 +0000 UTC m=+147.800469214" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.517368 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.525279 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.531636 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.554686 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.554974 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:44.054958724 +0000 UTC m=+148.356453054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.587100 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-rmm5l" podStartSLOduration=127.587072205 podStartE2EDuration="2m7.587072205s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.58523761 +0000 UTC m=+147.886731950" watchObservedRunningTime="2025-10-03 08:41:43.587072205 +0000 UTC m=+147.888566535" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.588600 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5g7vw" podStartSLOduration=9.588589573 podStartE2EDuration="9.588589573s" podCreationTimestamp="2025-10-03 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.530616414 +0000 UTC m=+147.832110744" watchObservedRunningTime="2025-10-03 08:41:43.588589573 +0000 UTC m=+147.890083903" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.656449 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.657440 4765 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:44.15742571 +0000 UTC m=+148.458920040 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.759823 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.760801 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:44.260771667 +0000 UTC m=+148.562265997 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.767460 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" event={"ID":"fb1c8d7c-5da9-41d7-85a7-3a36c632e7b3","Type":"ContainerStarted","Data":"7f411f08c11fd6306c0215934d747325f2bebe58564b53523585233c8713add4"} Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.795344 4765 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9g9cw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.795394 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" podUID="28cc4e4f-507b-49c7-9a8f-2107e600e834" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.863499 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.863849 4765 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:44.363837378 +0000 UTC m=+148.665331708 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:43 crc kubenswrapper[4765]: I1003 08:41:43.965960 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:43 crc kubenswrapper[4765]: E1003 08:41:43.967987 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:44.467970425 +0000 UTC m=+148.769464755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.052951 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:44 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:44 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:44 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.053010 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.075275 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:44 crc kubenswrapper[4765]: E1003 08:41:44.076184 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-03 08:41:44.576166862 +0000 UTC m=+148.877661192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.176426 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:44 crc kubenswrapper[4765]: E1003 08:41:44.176837 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:44.676819073 +0000 UTC m=+148.978313403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.251772 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" podStartSLOduration=128.25174887 podStartE2EDuration="2m8.25174887s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:43.823839722 +0000 UTC m=+148.125334062" watchObservedRunningTime="2025-10-03 08:41:44.25174887 +0000 UTC m=+148.553243210" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.291661 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:44 crc kubenswrapper[4765]: E1003 08:41:44.292219 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:44.792192257 +0000 UTC m=+149.093686587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.398992 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:44 crc kubenswrapper[4765]: E1003 08:41:44.399369 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:44.899350088 +0000 UTC m=+149.200844418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.424916 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9hhf6"] Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.425930 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.430262 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.486021 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9hhf6"] Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.509735 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-catalog-content\") pod \"certified-operators-9hhf6\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.509817 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc4db\" (UniqueName: \"kubernetes.io/projected/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-kube-api-access-fc4db\") pod \"certified-operators-9hhf6\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.509851 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.509873 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-utilities\") pod \"certified-operators-9hhf6\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:41:44 crc kubenswrapper[4765]: E1003 08:41:44.510173 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.01016183 +0000 UTC m=+149.311656160 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.610290 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:44 crc kubenswrapper[4765]: E1003 08:41:44.610452 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.110418611 +0000 UTC m=+149.411912941 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.610918 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.610967 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-utilities\") pod \"certified-operators-9hhf6\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.611004 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-catalog-content\") pod \"certified-operators-9hhf6\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.611091 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc4db\" (UniqueName: \"kubernetes.io/projected/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-kube-api-access-fc4db\") pod \"certified-operators-9hhf6\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.611536 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-utilities\") pod \"certified-operators-9hhf6\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:41:44 crc kubenswrapper[4765]: E1003 08:41:44.611890 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.111878957 +0000 UTC m=+149.413373287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.611877 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-catalog-content\") pod \"certified-operators-9hhf6\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.650709 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc4db\" (UniqueName: \"kubernetes.io/projected/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-kube-api-access-fc4db\") pod \"certified-operators-9hhf6\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:41:44 crc kubenswrapper[4765]: W1003 08:41:44.669615 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-66d8e7f9b82f42da80a0a48bf72971e7659d8598b4d3ef508b05ea345c0a2767 WatchSource:0}: Error finding container 66d8e7f9b82f42da80a0a48bf72971e7659d8598b4d3ef508b05ea345c0a2767: Status 404 returned error can't find the container with id 66d8e7f9b82f42da80a0a48bf72971e7659d8598b4d3ef508b05ea345c0a2767 Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.673010 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8lvxz"] Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.674547 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:41:44 crc kubenswrapper[4765]: W1003 08:41:44.679994 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-eb744d279ecda5b145dbd3a150cde1527537d785dec09b00bd583f9139ec9458 WatchSource:0}: Error finding container eb744d279ecda5b145dbd3a150cde1527537d785dec09b00bd583f9139ec9458: Status 404 returned error can't find the container with id eb744d279ecda5b145dbd3a150cde1527537d785dec09b00bd583f9139ec9458 Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.683799 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.685170 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8lvxz"] Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.717073 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.717411 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vwkv\" (UniqueName: \"kubernetes.io/projected/4de34feb-a2a4-49c9-b066-f7a71b39cd06-kube-api-access-6vwkv\") pod \"community-operators-8lvxz\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.717517 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-catalog-content\") pod \"community-operators-8lvxz\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.717557 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-utilities\") pod \"community-operators-8lvxz\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:41:44 crc kubenswrapper[4765]: E1003 08:41:44.717724 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.217699116 +0000 UTC m=+149.519193456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.747397 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.811123 4765 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.812999 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0ab8170b5e2605e7ca2584dc157e9b1541c429c5eccc06e355dd267988bda64f"} Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.813039 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1f0ee9aed9f5af86e9482810bcf04e4a31f9aaac65ab5fce4d853d50bfdb0ffd"} Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.817063 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"66d8e7f9b82f42da80a0a48bf72971e7659d8598b4d3ef508b05ea345c0a2767"} Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.824370 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.824449 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-catalog-content\") pod \"community-operators-8lvxz\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.824484 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-utilities\") pod \"community-operators-8lvxz\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.824539 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vwkv\" (UniqueName: \"kubernetes.io/projected/4de34feb-a2a4-49c9-b066-f7a71b39cd06-kube-api-access-6vwkv\") pod \"community-operators-8lvxz\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " pod="openshift-marketplace/community-operators-8lvxz" Oct 03 
08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.825915 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-catalog-content\") pod \"community-operators-8lvxz\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:41:44 crc kubenswrapper[4765]: E1003 08:41:44.826240 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.32622431 +0000 UTC m=+149.627718640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.837777 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-utilities\") pod \"community-operators-8lvxz\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.853319 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4w7fr"] Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.878424 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.886101 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"eb744d279ecda5b145dbd3a150cde1527537d785dec09b00bd583f9139ec9458"} Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.895636 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" event={"ID":"193eea7c-6015-42df-b104-9a2848192515","Type":"ContainerStarted","Data":"c53a096cd84646cfcc6ec28c5bd53c6a2c49d3f85e3f4234645321e0530688a2"} Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.895706 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" event={"ID":"193eea7c-6015-42df-b104-9a2848192515","Type":"ContainerStarted","Data":"69968df926387a8d3945b21ef86348773649721701e6f065574ec15410fc4abb"} Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.903538 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vwkv\" (UniqueName: \"kubernetes.io/projected/4de34feb-a2a4-49c9-b066-f7a71b39cd06-kube-api-access-6vwkv\") pod \"community-operators-8lvxz\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.935789 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.936027 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-utilities\") pod \"certified-operators-4w7fr\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.936131 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-catalog-content\") pod \"certified-operators-4w7fr\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.936149 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtv5z\" (UniqueName: \"kubernetes.io/projected/ba3fb502-6081-420d-8ef8-a249a7e69e60-kube-api-access-gtv5z\") pod \"certified-operators-4w7fr\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:41:44 crc kubenswrapper[4765]: I1003 08:41:44.938671 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4w7fr"] Oct 03 08:41:44 crc kubenswrapper[4765]: E1003 08:41:44.939191 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.439169954 +0000 UTC m=+149.740664284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.004986 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q84qb"] Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.006201 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.021835 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:45 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:45 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:45 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.021895 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.025230 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q84qb"] Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.040461 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-utilities\") pod \"certified-operators-4w7fr\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.040528 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.040568 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-catalog-content\") pod \"certified-operators-4w7fr\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.040584 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtv5z\" (UniqueName: \"kubernetes.io/projected/ba3fb502-6081-420d-8ef8-a249a7e69e60-kube-api-access-gtv5z\") pod \"certified-operators-4w7fr\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " 
pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.041665 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-utilities\") pod \"certified-operators-4w7fr\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:41:45 crc kubenswrapper[4765]: E1003 08:41:45.041962 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.541949478 +0000 UTC m=+149.843443808 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.042212 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-catalog-content\") pod \"certified-operators-4w7fr\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.051088 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.078247 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtv5z\" (UniqueName: \"kubernetes.io/projected/ba3fb502-6081-420d-8ef8-a249a7e69e60-kube-api-access-gtv5z\") pod \"certified-operators-4w7fr\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.141458 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:45 crc kubenswrapper[4765]: E1003 08:41:45.141803 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.641778539 +0000 UTC m=+149.943272859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.142215 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2269\" (UniqueName: \"kubernetes.io/projected/de26c094-f060-4b1d-b06f-13bf0f1794ce-kube-api-access-b2269\") pod \"community-operators-q84qb\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.142265 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.142328 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-catalog-content\") pod \"community-operators-q84qb\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.142353 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-utilities\") pod \"community-operators-q84qb\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:41:45 crc kubenswrapper[4765]: E1003 08:41:45.142821 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.642811834 +0000 UTC m=+149.944306164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.243520 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.243986 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2269\" (UniqueName: \"kubernetes.io/projected/de26c094-f060-4b1d-b06f-13bf0f1794ce-kube-api-access-b2269\") pod \"community-operators-q84qb\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.244067 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-catalog-content\") pod \"community-operators-q84qb\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.244091 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-utilities\") pod \"community-operators-q84qb\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.245051 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-utilities\") pod \"community-operators-q84qb\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:41:45 crc kubenswrapper[4765]: E1003 08:41:45.245148 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.745119436 +0000 UTC m=+150.046613766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.245748 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-catalog-content\") pod \"community-operators-q84qb\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.249958 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.271424 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2269\" (UniqueName: \"kubernetes.io/projected/de26c094-f060-4b1d-b06f-13bf0f1794ce-kube-api-access-b2269\") pod \"community-operators-q84qb\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.321881 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9hhf6"] Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.343035 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.345882 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:45 crc kubenswrapper[4765]: E1003 08:41:45.346386 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.846366572 +0000 UTC m=+150.147860902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-qwb6x" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.391204 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8lvxz"] Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.450293 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:45 crc kubenswrapper[4765]: E1003 08:41:45.450880 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:45.950852728 +0000 UTC m=+150.252347058 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.511795 4765 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-03T08:41:44.81114699Z","Handler":null,"Name":""} Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.515367 4765 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.515406 4765 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.551924 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.557199 4765 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.557239 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.588309 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-qwb6x\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.629349 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4w7fr"] Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.630776 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.654138 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.684715 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.716849 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q84qb"] Oct 03 08:41:45 crc kubenswrapper[4765]: W1003 08:41:45.760135 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde26c094_f060_4b1d_b06f_13bf0f1794ce.slice/crio-52c675bd5ed319ccf4f0e24a6691cac2de9331fbb4ab75f165fccfaff5957ba7 WatchSource:0}: Error finding container 52c675bd5ed319ccf4f0e24a6691cac2de9331fbb4ab75f165fccfaff5957ba7: Status 404 returned error can't find the container with id 52c675bd5ed319ccf4f0e24a6691cac2de9331fbb4ab75f165fccfaff5957ba7 Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.906208 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qwb6x"] Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.914471 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" event={"ID":"193eea7c-6015-42df-b104-9a2848192515","Type":"ContainerStarted","Data":"923c51f128fc3e5ae82cf0c313683601fc6dc63089804c146390f27caf0fce32"} Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.916948 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"329d494121fc4824ed5d92e7500f9b1ec823950847faa21e2c970c7b5cc1bf4b"} Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.917552 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.920363 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e088ed85f4bd083738948207989220965dea3bcdf503c71749e806d8eaa44af2"} Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.927628 4765 generic.go:334] "Generic (PLEG): container finished" podID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" containerID="a62e6791acdd0b386426527d3d9540d23a6430da734d022b40ce600c4d95995a" exitCode=0 Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.927772 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hhf6" event={"ID":"52d70e1c-3f04-4bab-a6a3-2ea9d66489db","Type":"ContainerDied","Data":"a62e6791acdd0b386426527d3d9540d23a6430da734d022b40ce600c4d95995a"} Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.927819 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hhf6" event={"ID":"52d70e1c-3f04-4bab-a6a3-2ea9d66489db","Type":"ContainerStarted","Data":"813fe3195fae0fcc8814e443c775e31effa0c57a4ee5f78bde2bc2d0dcea0d1b"} Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.938810 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.939149 4765 generic.go:334] "Generic (PLEG): container finished" podID="ba3fb502-6081-420d-8ef8-a249a7e69e60" containerID="0bd2d5d61d76395548491dfb4b9e73b01c57de23e4db86dc0a918ebc4dd36f60" exitCode=0 Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.939244 4765 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w7fr" event={"ID":"ba3fb502-6081-420d-8ef8-a249a7e69e60","Type":"ContainerDied","Data":"0bd2d5d61d76395548491dfb4b9e73b01c57de23e4db86dc0a918ebc4dd36f60"} Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.939278 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w7fr" event={"ID":"ba3fb502-6081-420d-8ef8-a249a7e69e60","Type":"ContainerStarted","Data":"5ae33041628482000243745b7c41e2095c7901d591469dbc168d0237d9d6841b"} Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.955199 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-xjnnd" podStartSLOduration=11.955163559 podStartE2EDuration="11.955163559s" podCreationTimestamp="2025-10-03 08:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:45.937402441 +0000 UTC m=+150.238896781" watchObservedRunningTime="2025-10-03 08:41:45.955163559 +0000 UTC m=+150.256657889" Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.968700 4765 generic.go:334] "Generic (PLEG): container finished" podID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" containerID="c2e8e2d4dc455287d7e0c91edd5b9f890e1d42f190459e0b7478e41746b05130" exitCode=0 Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.968814 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lvxz" event={"ID":"4de34feb-a2a4-49c9-b066-f7a71b39cd06","Type":"ContainerDied","Data":"c2e8e2d4dc455287d7e0c91edd5b9f890e1d42f190459e0b7478e41746b05130"} Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.968853 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lvxz" event={"ID":"4de34feb-a2a4-49c9-b066-f7a71b39cd06","Type":"ContainerStarted","Data":"0850e92d765660f276b519be6f2dba13010d1a6d02efa38b6ad3722a95673323"} Oct 03 08:41:45 crc kubenswrapper[4765]: I1003 08:41:45.970968 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q84qb" event={"ID":"de26c094-f060-4b1d-b06f-13bf0f1794ce","Type":"ContainerStarted","Data":"52c675bd5ed319ccf4f0e24a6691cac2de9331fbb4ab75f165fccfaff5957ba7"} Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.028309 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:46 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:46 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:46 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.028412 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.235633 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-9k7k4" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.314776 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.395923 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zfkjv"] Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.397534 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.402060 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.421744 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfkjv"] Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.572187 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68rb5\" (UniqueName: \"kubernetes.io/projected/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-kube-api-access-68rb5\") pod \"redhat-marketplace-zfkjv\" (UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.572276 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-utilities\") pod \"redhat-marketplace-zfkjv\" (UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.572298 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-catalog-content\") pod \"redhat-marketplace-zfkjv\" (UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.673668 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-utilities\") pod \"redhat-marketplace-zfkjv\" (UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.673754 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-catalog-content\") pod \"redhat-marketplace-zfkjv\" (UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.673839 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68rb5\" (UniqueName: \"kubernetes.io/projected/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-kube-api-access-68rb5\") pod \"redhat-marketplace-zfkjv\" (UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.674361 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-catalog-content\") pod \"redhat-marketplace-zfkjv\" 
(UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.674341 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-utilities\") pod \"redhat-marketplace-zfkjv\" (UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.679623 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.679678 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.681855 4765 patch_prober.go:28] interesting pod/console-f9d7485db-g8jbc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.34:8443/health\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.681968 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g8jbc" podUID="d6e8ca49-1faf-4e22-8760-d7eca3820980" containerName="console" probeResult="failure" output="Get \"https://10.217.0.34:8443/health\": dial tcp 10.217.0.34:8443: connect: connection refused" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.698314 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68rb5\" (UniqueName: \"kubernetes.io/projected/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-kube-api-access-68rb5\") pod \"redhat-marketplace-zfkjv\" (UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.716945 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.792661 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8pvgz"] Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.793788 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.802310 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pvgz"] Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.906536 4765 patch_prober.go:28] interesting pod/downloads-7954f5f757-qdr5x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.906898 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qdr5x" podUID="d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.906536 4765 patch_prober.go:28] interesting pod/downloads-7954f5f757-qdr5x container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" start-of-body= Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.907317 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qdr5x" podUID="d6fe9149-6e84-4fe5-97b0-5b6fd0a522bc" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.7:8080/\": dial tcp 10.217.0.7:8080: connect: connection refused" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.917956 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.919888 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.933284 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.979243 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lj49\" (UniqueName: \"kubernetes.io/projected/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-kube-api-access-8lj49\") pod \"redhat-marketplace-8pvgz\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.979327 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-utilities\") pod \"redhat-marketplace-8pvgz\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.979515 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-catalog-content\") pod \"redhat-marketplace-8pvgz\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.990314 4765 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" event={"ID":"32b16068-abfd-4a3f-870c-a17c7ff31d4b","Type":"ContainerStarted","Data":"e92cfe8eb3ebb1e407b2a79114c38b5b0ef27cbdbbc86d9736f8790ee2d1e875"} Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.990363 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" event={"ID":"32b16068-abfd-4a3f-870c-a17c7ff31d4b","Type":"ContainerStarted","Data":"16f5447bc263041e5e6ab54c072d1a8842f6702c45704c96e83efcd4ee3ba199"} Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.991342 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.992454 4765 generic.go:334] "Generic (PLEG): container finished" podID="de26c094-f060-4b1d-b06f-13bf0f1794ce" containerID="5934c00221ebc58714a0bbfc40a96eb4bc372f3a98f5c73e3917ad88b760581f" exitCode=0 Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.992703 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q84qb" event={"ID":"de26c094-f060-4b1d-b06f-13bf0f1794ce","Type":"ContainerDied","Data":"5934c00221ebc58714a0bbfc40a96eb4bc372f3a98f5c73e3917ad88b760581f"} Oct 03 08:41:46 crc kubenswrapper[4765]: I1003 08:41:46.997429 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gcgfs" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.017071 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.022225 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:47 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:47 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:47 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.022283 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.023279 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" podStartSLOduration=131.023252798 podStartE2EDuration="2m11.023252798s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:47.017452125 +0000 UTC m=+151.318946475" watchObservedRunningTime="2025-10-03 08:41:47.023252798 +0000 UTC m=+151.324747148" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.086092 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-catalog-content\") pod \"redhat-marketplace-8pvgz\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:41:47 crc 
kubenswrapper[4765]: I1003 08:41:47.086296 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lj49\" (UniqueName: \"kubernetes.io/projected/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-kube-api-access-8lj49\") pod \"redhat-marketplace-8pvgz\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.086359 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-utilities\") pod \"redhat-marketplace-8pvgz\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.088056 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-catalog-content\") pod \"redhat-marketplace-8pvgz\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.089947 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-utilities\") pod \"redhat-marketplace-8pvgz\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.121072 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lj49\" (UniqueName: \"kubernetes.io/projected/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-kube-api-access-8lj49\") pod \"redhat-marketplace-8pvgz\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.121437 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.225657 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfkjv"] Oct 03 08:41:47 crc kubenswrapper[4765]: W1003 08:41:47.244667 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd55c53fc_df46_4bb6_a4b7_4d269d965dc6.slice/crio-1889bbcf53f87ace244a7ea33e7f37a9c1c476bfe674cd6b469ccf9ba1f4d99b WatchSource:0}: Error finding container 1889bbcf53f87ace244a7ea33e7f37a9c1c476bfe674cd6b469ccf9ba1f4d99b: Status 404 returned error can't find the container with id 1889bbcf53f87ace244a7ea33e7f37a9c1c476bfe674cd6b469ccf9ba1f4d99b Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.430727 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.432047 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.439579 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.439827 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.444624 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.468017 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pvgz"] Oct 03 08:41:47 crc kubenswrapper[4765]: W1003 08:41:47.485608 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c3ab66b_9f3e_4764_a5f6_acf1f378e489.slice/crio-21b8779ebb8f94cbe94f21afc3850c00b6533f1a040e8f5ffb25cdad6c3f81da WatchSource:0}: Error finding container 21b8779ebb8f94cbe94f21afc3850c00b6533f1a040e8f5ffb25cdad6c3f81da: Status 404 returned error can't find the container with id 21b8779ebb8f94cbe94f21afc3850c00b6533f1a040e8f5ffb25cdad6c3f81da Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.608321 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.608931 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.673810 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.673869 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.684123 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.710189 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.710295 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.710416 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.744474 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.767049 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.824869 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sgxf4"] Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.826574 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.834535 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.873549 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sgxf4"] Oct 03 08:41:47 crc kubenswrapper[4765]: I1003 08:41:47.909227 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.010338 4765 generic.go:334] "Generic (PLEG): container finished" podID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" containerID="0285cd5fc73847426afcdfeb5b226e9ed90d74c3612eea25d3fdaa161ed38cb9" exitCode=0 Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.010936 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfkjv" event={"ID":"d55c53fc-df46-4bb6-a4b7-4d269d965dc6","Type":"ContainerDied","Data":"0285cd5fc73847426afcdfeb5b226e9ed90d74c3612eea25d3fdaa161ed38cb9"} Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.010987 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfkjv" event={"ID":"d55c53fc-df46-4bb6-a4b7-4d269d965dc6","Type":"ContainerStarted","Data":"1889bbcf53f87ace244a7ea33e7f37a9c1c476bfe674cd6b469ccf9ba1f4d99b"} Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.021555 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-catalog-content\") pod \"redhat-operators-sgxf4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.021668 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7sjc\" (UniqueName: 
\"kubernetes.io/projected/b280de40-e91c-4010-9173-48ed01320bd4-kube-api-access-h7sjc\") pod \"redhat-operators-sgxf4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.021696 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-utilities\") pod \"redhat-operators-sgxf4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.023294 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" event={"ID":"432cff95-d219-46af-bfc4-c5afbe99c9c0","Type":"ContainerDied","Data":"69df2fe02aca2ccab6a7c3ba623df1fe4c5bd0889c95dcaab6a47e692ab00860"} Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.025225 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:48 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:48 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:48 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.025262 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.023071 4765 generic.go:334] "Generic (PLEG): container finished" podID="432cff95-d219-46af-bfc4-c5afbe99c9c0" containerID="69df2fe02aca2ccab6a7c3ba623df1fe4c5bd0889c95dcaab6a47e692ab00860" exitCode=0 Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.046158 4765 generic.go:334] "Generic (PLEG): container finished" podID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" containerID="0a19d6023eb03d5076d3dd7fcf1dfb4aa15c68389cdb222d1e96593faabc90de" exitCode=0 Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.046454 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pvgz" event={"ID":"9c3ab66b-9f3e-4764-a5f6-acf1f378e489","Type":"ContainerDied","Data":"0a19d6023eb03d5076d3dd7fcf1dfb4aa15c68389cdb222d1e96593faabc90de"} Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.046550 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pvgz" event={"ID":"9c3ab66b-9f3e-4764-a5f6-acf1f378e489","Type":"ContainerStarted","Data":"21b8779ebb8f94cbe94f21afc3850c00b6533f1a040e8f5ffb25cdad6c3f81da"} Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.064853 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-l2dpj" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.124130 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-catalog-content\") pod \"redhat-operators-sgxf4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:41:48 crc 
kubenswrapper[4765]: I1003 08:41:48.124253 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7sjc\" (UniqueName: \"kubernetes.io/projected/b280de40-e91c-4010-9173-48ed01320bd4-kube-api-access-h7sjc\") pod \"redhat-operators-sgxf4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.124285 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-utilities\") pod \"redhat-operators-sgxf4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.125749 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-catalog-content\") pod \"redhat-operators-sgxf4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.126366 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-utilities\") pod \"redhat-operators-sgxf4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.158104 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7sjc\" (UniqueName: \"kubernetes.io/projected/b280de40-e91c-4010-9173-48ed01320bd4-kube-api-access-h7sjc\") pod \"redhat-operators-sgxf4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.203975 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fmg4j"] Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.205350 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.211812 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.240720 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:41:48 crc kubenswrapper[4765]: W1003 08:41:48.261693 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9c5e0b35_5d25_4e2b_9a0a_accdf6cc5e23.slice/crio-7ff29d4b6a2b93f483be622f3d498d1928d34dc17ecf3ee475b6403a3cedc1e3 WatchSource:0}: Error finding container 7ff29d4b6a2b93f483be622f3d498d1928d34dc17ecf3ee475b6403a3cedc1e3: Status 404 returned error can't find the container with id 7ff29d4b6a2b93f483be622f3d498d1928d34dc17ecf3ee475b6403a3cedc1e3 Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.271623 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmg4j"] Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.338486 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-catalog-content\") pod \"redhat-operators-fmg4j\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.338957 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-utilities\") pod \"redhat-operators-fmg4j\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.338992 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htwg7\" (UniqueName: \"kubernetes.io/projected/111267ae-7647-4185-8fd3-138ad5d3a864-kube-api-access-htwg7\") pod \"redhat-operators-fmg4j\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.444260 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-catalog-content\") pod \"redhat-operators-fmg4j\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.444621 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-utilities\") pod \"redhat-operators-fmg4j\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.444697 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htwg7\" (UniqueName: \"kubernetes.io/projected/111267ae-7647-4185-8fd3-138ad5d3a864-kube-api-access-htwg7\") pod \"redhat-operators-fmg4j\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.444784 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-catalog-content\") pod \"redhat-operators-fmg4j\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:41:48 
crc kubenswrapper[4765]: I1003 08:41:48.445090 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-utilities\") pod \"redhat-operators-fmg4j\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.474346 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htwg7\" (UniqueName: \"kubernetes.io/projected/111267ae-7647-4185-8fd3-138ad5d3a864-kube-api-access-htwg7\") pod \"redhat-operators-fmg4j\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.542147 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.593096 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.671846 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sgxf4"] Oct 03 08:41:48 crc kubenswrapper[4765]: W1003 08:41:48.719705 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb280de40_e91c_4010_9173_48ed01320bd4.slice/crio-aef576156db95aa8b8d3f3694194eaacda5ebd22be014388ad123590a841fd4c WatchSource:0}: Error finding container aef576156db95aa8b8d3f3694194eaacda5ebd22be014388ad123590a841fd4c: Status 404 returned error can't find the container with id aef576156db95aa8b8d3f3694194eaacda5ebd22be014388ad123590a841fd4c Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.755016 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.756247 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.770267 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.771709 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.774277 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.852701 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.852847 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.954336 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.954449 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.954591 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:41:48 crc kubenswrapper[4765]: I1003 08:41:48.991551 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.029243 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:49 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:49 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:49 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.029301 4765 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.029831 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fmg4j"] Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.093195 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23","Type":"ContainerStarted","Data":"7ff29d4b6a2b93f483be622f3d498d1928d34dc17ecf3ee475b6403a3cedc1e3"} Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.097119 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmg4j" event={"ID":"111267ae-7647-4185-8fd3-138ad5d3a864","Type":"ContainerStarted","Data":"19344b48e560f3ef461a7947addbe9957fec9b26f66c6ec945605f664684e0c1"} Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.106219 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgxf4" event={"ID":"b280de40-e91c-4010-9173-48ed01320bd4","Type":"ContainerStarted","Data":"aef576156db95aa8b8d3f3694194eaacda5ebd22be014388ad123590a841fd4c"} Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.131263 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.539781 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.683496 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2j29\" (UniqueName: \"kubernetes.io/projected/432cff95-d219-46af-bfc4-c5afbe99c9c0-kube-api-access-s2j29\") pod \"432cff95-d219-46af-bfc4-c5afbe99c9c0\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.683593 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/432cff95-d219-46af-bfc4-c5afbe99c9c0-config-volume\") pod \"432cff95-d219-46af-bfc4-c5afbe99c9c0\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.683637 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/432cff95-d219-46af-bfc4-c5afbe99c9c0-secret-volume\") pod \"432cff95-d219-46af-bfc4-c5afbe99c9c0\" (UID: \"432cff95-d219-46af-bfc4-c5afbe99c9c0\") " Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.687370 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/432cff95-d219-46af-bfc4-c5afbe99c9c0-config-volume" (OuterVolumeSpecName: "config-volume") pod "432cff95-d219-46af-bfc4-c5afbe99c9c0" (UID: "432cff95-d219-46af-bfc4-c5afbe99c9c0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.692857 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/432cff95-d219-46af-bfc4-c5afbe99c9c0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "432cff95-d219-46af-bfc4-c5afbe99c9c0" (UID: "432cff95-d219-46af-bfc4-c5afbe99c9c0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.694308 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/432cff95-d219-46af-bfc4-c5afbe99c9c0-kube-api-access-s2j29" (OuterVolumeSpecName: "kube-api-access-s2j29") pod "432cff95-d219-46af-bfc4-c5afbe99c9c0" (UID: "432cff95-d219-46af-bfc4-c5afbe99c9c0"). InnerVolumeSpecName "kube-api-access-s2j29". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.741112 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 08:41:49 crc kubenswrapper[4765]: W1003 08:41:49.752202 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd532aa95_c0e2_4e0a_b3e9_d67cdf15b1a6.slice/crio-6cf5e2da614238475ee702e2b5a862d6da4260517c55ea58627bb5d261167d10 WatchSource:0}: Error finding container 6cf5e2da614238475ee702e2b5a862d6da4260517c55ea58627bb5d261167d10: Status 404 returned error can't find the container with id 6cf5e2da614238475ee702e2b5a862d6da4260517c55ea58627bb5d261167d10 Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.785868 4765 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/432cff95-d219-46af-bfc4-c5afbe99c9c0-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.785905 4765 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/432cff95-d219-46af-bfc4-c5afbe99c9c0-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:49 crc kubenswrapper[4765]: I1003 08:41:49.785915 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2j29\" (UniqueName: \"kubernetes.io/projected/432cff95-d219-46af-bfc4-c5afbe99c9c0-kube-api-access-s2j29\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:50 crc kubenswrapper[4765]: I1003 08:41:50.021355 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:50 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:50 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:50 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:50 crc kubenswrapper[4765]: I1003 08:41:50.021449 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:50 crc kubenswrapper[4765]: I1003 08:41:50.141235 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" Oct 03 08:41:50 crc kubenswrapper[4765]: I1003 08:41:50.141275 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8" event={"ID":"432cff95-d219-46af-bfc4-c5afbe99c9c0","Type":"ContainerDied","Data":"84b5e30a2661036d7c3e2520ca56192ba3ced9248fa0b33efe611b28b75cc791"} Oct 03 08:41:50 crc kubenswrapper[4765]: I1003 08:41:50.141817 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84b5e30a2661036d7c3e2520ca56192ba3ced9248fa0b33efe611b28b75cc791" Oct 03 08:41:50 crc kubenswrapper[4765]: I1003 08:41:50.154515 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23","Type":"ContainerStarted","Data":"f2fc4c072165ebc2c448caa012f9fea8c926b8d0712e898b38c703aaa55c9152"} Oct 03 08:41:50 crc kubenswrapper[4765]: I1003 08:41:50.171220 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6","Type":"ContainerStarted","Data":"6cf5e2da614238475ee702e2b5a862d6da4260517c55ea58627bb5d261167d10"} Oct 03 08:41:51 crc kubenswrapper[4765]: I1003 08:41:51.024805 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:51 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:51 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:51 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:51 crc kubenswrapper[4765]: I1003 08:41:51.024880 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:51 crc kubenswrapper[4765]: I1003 08:41:51.198281 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6","Type":"ContainerStarted","Data":"739566f7b53ca6d75c9329f78fb6d9e5f14bda70a8800323df1fb31dc9b61fda"} Oct 03 08:41:51 crc kubenswrapper[4765]: I1003 08:41:51.207988 4765 generic.go:334] "Generic (PLEG): container finished" podID="b280de40-e91c-4010-9173-48ed01320bd4" containerID="ce4f29e8ce2d9a0eea23527c493640dfd37319f108e127813877781a2a1e4451" exitCode=0 Oct 03 08:41:51 crc kubenswrapper[4765]: I1003 08:41:51.208091 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgxf4" event={"ID":"b280de40-e91c-4010-9173-48ed01320bd4","Type":"ContainerDied","Data":"ce4f29e8ce2d9a0eea23527c493640dfd37319f108e127813877781a2a1e4451"} Oct 03 08:41:51 crc kubenswrapper[4765]: I1003 08:41:51.218858 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.218836441 podStartE2EDuration="3.218836441s" podCreationTimestamp="2025-10-03 08:41:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:51.216246468 +0000 UTC m=+155.517740798" 
watchObservedRunningTime="2025-10-03 08:41:51.218836441 +0000 UTC m=+155.520330771" Oct 03 08:41:51 crc kubenswrapper[4765]: I1003 08:41:51.220261 4765 generic.go:334] "Generic (PLEG): container finished" podID="111267ae-7647-4185-8fd3-138ad5d3a864" containerID="da742df054ea3fed9b0b9ebbc09a13684254c1f1e93d9aefc2ecbbf1fb847657" exitCode=0 Oct 03 08:41:51 crc kubenswrapper[4765]: I1003 08:41:51.220345 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmg4j" event={"ID":"111267ae-7647-4185-8fd3-138ad5d3a864","Type":"ContainerDied","Data":"da742df054ea3fed9b0b9ebbc09a13684254c1f1e93d9aefc2ecbbf1fb847657"} Oct 03 08:41:51 crc kubenswrapper[4765]: I1003 08:41:51.224919 4765 generic.go:334] "Generic (PLEG): container finished" podID="9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23" containerID="f2fc4c072165ebc2c448caa012f9fea8c926b8d0712e898b38c703aaa55c9152" exitCode=0 Oct 03 08:41:51 crc kubenswrapper[4765]: I1003 08:41:51.224967 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23","Type":"ContainerDied","Data":"f2fc4c072165ebc2c448caa012f9fea8c926b8d0712e898b38c703aaa55c9152"} Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.020638 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:52 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:52 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:52 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.021044 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.275821 4765 generic.go:334] "Generic (PLEG): container finished" podID="d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6" containerID="739566f7b53ca6d75c9329f78fb6d9e5f14bda70a8800323df1fb31dc9b61fda" exitCode=0 Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.275960 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6","Type":"ContainerDied","Data":"739566f7b53ca6d75c9329f78fb6d9e5f14bda70a8800323df1fb31dc9b61fda"} Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.656253 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.743862 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23" (UID: "9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.743710 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kubelet-dir\") pod \"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23\" (UID: \"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23\") " Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.744006 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kube-api-access\") pod \"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23\" (UID: \"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23\") " Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.746589 4765 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.751516 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23" (UID: "9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.848941 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:52 crc kubenswrapper[4765]: I1003 08:41:52.945414 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5g7vw" Oct 03 08:41:53 crc kubenswrapper[4765]: I1003 08:41:53.030659 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:53 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:53 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:53 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:53 crc kubenswrapper[4765]: I1003 08:41:53.030757 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:53 crc kubenswrapper[4765]: I1003 08:41:53.287166 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23","Type":"ContainerDied","Data":"7ff29d4b6a2b93f483be622f3d498d1928d34dc17ecf3ee475b6403a3cedc1e3"} Oct 03 08:41:53 crc kubenswrapper[4765]: I1003 08:41:53.287219 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:41:53 crc kubenswrapper[4765]: I1003 08:41:53.287249 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ff29d4b6a2b93f483be622f3d498d1928d34dc17ecf3ee475b6403a3cedc1e3" Oct 03 08:41:54 crc kubenswrapper[4765]: I1003 08:41:54.022498 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:54 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:54 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:54 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:54 crc kubenswrapper[4765]: I1003 08:41:54.022926 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:55 crc kubenswrapper[4765]: I1003 08:41:55.020011 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:55 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Oct 03 08:41:55 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:55 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:55 crc kubenswrapper[4765]: I1003 08:41:55.020081 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:56 crc kubenswrapper[4765]: I1003 08:41:56.019896 4765 patch_prober.go:28] interesting pod/router-default-5444994796-f64ph container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:41:56 crc kubenswrapper[4765]: [+]has-synced ok Oct 03 08:41:56 crc kubenswrapper[4765]: [+]process-running ok Oct 03 08:41:56 crc kubenswrapper[4765]: healthz check failed Oct 03 08:41:56 crc kubenswrapper[4765]: I1003 08:41:56.020303 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f64ph" podUID="1a071347-8c80-4f91-87f3-1d95c7b18a1c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:41:56 crc kubenswrapper[4765]: I1003 08:41:56.680622 4765 patch_prober.go:28] interesting pod/console-f9d7485db-g8jbc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.34:8443/health\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Oct 03 08:41:56 crc kubenswrapper[4765]: I1003 08:41:56.680692 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g8jbc" podUID="d6e8ca49-1faf-4e22-8760-d7eca3820980" containerName="console" probeResult="failure" output="Get \"https://10.217.0.34:8443/health\": dial tcp 10.217.0.34:8443: connect: connection refused" Oct 03 08:41:56 crc kubenswrapper[4765]: I1003 08:41:56.912065 4765 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qdr5x" Oct 03 08:41:57 crc kubenswrapper[4765]: I1003 08:41:57.020121 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:57 crc kubenswrapper[4765]: I1003 08:41:57.023758 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-f64ph" Oct 03 08:41:58 crc kubenswrapper[4765]: I1003 08:41:58.474076 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:58 crc kubenswrapper[4765]: I1003 08:41:58.486589 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6824483c-e9a7-4e95-bb3d-e00bac2af3aa-metrics-certs\") pod \"network-metrics-daemon-wdwf5\" (UID: \"6824483c-e9a7-4e95-bb3d-e00bac2af3aa\") " pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:41:58 crc kubenswrapper[4765]: I1003 08:41:58.719016 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdwf5" Oct 03 08:42:00 crc kubenswrapper[4765]: I1003 08:42:00.516132 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:42:00 crc kubenswrapper[4765]: I1003 08:42:00.605450 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kubelet-dir\") pod \"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6\" (UID: \"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6\") " Oct 03 08:42:00 crc kubenswrapper[4765]: I1003 08:42:00.605574 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kube-api-access\") pod \"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6\" (UID: \"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6\") " Oct 03 08:42:00 crc kubenswrapper[4765]: I1003 08:42:00.605565 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6" (UID: "d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:42:00 crc kubenswrapper[4765]: I1003 08:42:00.606026 4765 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:00 crc kubenswrapper[4765]: I1003 08:42:00.611958 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6" (UID: "d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:42:00 crc kubenswrapper[4765]: I1003 08:42:00.680518 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:42:00 crc kubenswrapper[4765]: I1003 08:42:00.680625 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:42:00 crc kubenswrapper[4765]: I1003 08:42:00.707577 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:01 crc kubenswrapper[4765]: I1003 08:42:01.355040 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6","Type":"ContainerDied","Data":"6cf5e2da614238475ee702e2b5a862d6da4260517c55ea58627bb5d261167d10"} Oct 03 08:42:01 crc kubenswrapper[4765]: I1003 08:42:01.355082 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cf5e2da614238475ee702e2b5a862d6da4260517c55ea58627bb5d261167d10" Oct 03 08:42:01 crc kubenswrapper[4765]: I1003 08:42:01.355506 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:42:05 crc kubenswrapper[4765]: I1003 08:42:05.638206 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:42:06 crc kubenswrapper[4765]: I1003 08:42:06.684597 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:42:06 crc kubenswrapper[4765]: I1003 08:42:06.689087 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:42:17 crc kubenswrapper[4765]: I1003 08:42:17.850152 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gvz2v" Oct 03 08:42:19 crc kubenswrapper[4765]: E1003 08:42:19.128754 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 08:42:19 crc kubenswrapper[4765]: E1003 08:42:19.129525 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fc4db,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9hhf6_openshift-marketplace(52d70e1c-3f04-4bab-a6a3-2ea9d66489db): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:42:19 crc kubenswrapper[4765]: E1003 08:42:19.130779 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9hhf6" podUID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" Oct 03 08:42:22 crc kubenswrapper[4765]: E1003 08:42:22.064671 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9hhf6" podUID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" Oct 03 08:42:22 crc kubenswrapper[4765]: E1003 08:42:22.135101 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 08:42:22 crc kubenswrapper[4765]: E1003 08:42:22.135399 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-htwg7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fmg4j_openshift-marketplace(111267ae-7647-4185-8fd3-138ad5d3a864): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:42:22 crc kubenswrapper[4765]: E1003 08:42:22.136561 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fmg4j" podUID="111267ae-7647-4185-8fd3-138ad5d3a864" Oct 03 08:42:22 crc kubenswrapper[4765]: E1003 08:42:22.146486 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 08:42:22 crc kubenswrapper[4765]: E1003 08:42:22.146689 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtv5z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-4w7fr_openshift-marketplace(ba3fb502-6081-420d-8ef8-a249a7e69e60): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:42:22 crc kubenswrapper[4765]: E1003 08:42:22.148013 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-4w7fr" podUID="ba3fb502-6081-420d-8ef8-a249a7e69e60" Oct 03 08:42:23 crc kubenswrapper[4765]: I1003 08:42:23.537151 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:25 crc kubenswrapper[4765]: E1003 08:42:25.061213 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-4w7fr" podUID="ba3fb502-6081-420d-8ef8-a249a7e69e60" Oct 03 08:42:25 crc kubenswrapper[4765]: E1003 08:42:25.061368 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fmg4j" podUID="111267ae-7647-4185-8fd3-138ad5d3a864" Oct 03 08:42:26 crc kubenswrapper[4765]: E1003 08:42:26.609205 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 08:42:26 crc kubenswrapper[4765]: E1003 08:42:26.609360 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vwkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8lvxz_openshift-marketplace(4de34feb-a2a4-49c9-b066-f7a71b39cd06): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:42:26 crc kubenswrapper[4765]: E1003 08:42:26.611096 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8lvxz" podUID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" Oct 03 08:42:26 crc kubenswrapper[4765]: E1003 08:42:26.623916 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 08:42:26 crc kubenswrapper[4765]: E1003 08:42:26.624393 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b2269,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-q84qb_openshift-marketplace(de26c094-f060-4b1d-b06f-13bf0f1794ce): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:42:26 crc kubenswrapper[4765]: E1003 08:42:26.625655 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-q84qb" podUID="de26c094-f060-4b1d-b06f-13bf0f1794ce" Oct 03 08:42:26 crc kubenswrapper[4765]: I1003 08:42:26.924782 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wdwf5"] Oct 03 08:42:27 crc kubenswrapper[4765]: W1003 08:42:27.452294 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6824483c_e9a7_4e95_bb3d_e00bac2af3aa.slice/crio-f9c542693a5a214c2dd8f279d4823083b27a765dcf7762918f1126594b74768c WatchSource:0}: Error finding container f9c542693a5a214c2dd8f279d4823083b27a765dcf7762918f1126594b74768c: Status 404 returned error can't find the container with id f9c542693a5a214c2dd8f279d4823083b27a765dcf7762918f1126594b74768c Oct 03 08:42:27 crc kubenswrapper[4765]: E1003 08:42:27.497599 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 08:42:27 crc kubenswrapper[4765]: E1003 08:42:27.498008 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8lj49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-8pvgz_openshift-marketplace(9c3ab66b-9f3e-4764-a5f6-acf1f378e489): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:42:27 crc kubenswrapper[4765]: E1003 08:42:27.499417 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-8pvgz" podUID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" Oct 03 08:42:27 crc kubenswrapper[4765]: E1003 08:42:27.507813 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 08:42:27 crc kubenswrapper[4765]: E1003 08:42:27.508065 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68rb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-zfkjv_openshift-marketplace(d55c53fc-df46-4bb6-a4b7-4d269d965dc6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:42:27 crc kubenswrapper[4765]: E1003 08:42:27.510229 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-zfkjv" podUID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" Oct 03 08:42:27 crc kubenswrapper[4765]: I1003 08:42:27.527627 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" event={"ID":"6824483c-e9a7-4e95-bb3d-e00bac2af3aa","Type":"ContainerStarted","Data":"f9c542693a5a214c2dd8f279d4823083b27a765dcf7762918f1126594b74768c"} Oct 03 08:42:27 crc kubenswrapper[4765]: E1003 08:42:27.530528 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q84qb" podUID="de26c094-f060-4b1d-b06f-13bf0f1794ce" Oct 03 08:42:27 crc kubenswrapper[4765]: E1003 08:42:27.530548 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-8pvgz" podUID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" Oct 03 08:42:27 crc kubenswrapper[4765]: E1003 08:42:27.530550 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8lvxz" podUID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" Oct 03 08:42:27 crc kubenswrapper[4765]: E1003 08:42:27.530942 4765 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-zfkjv" podUID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" Oct 03 08:42:28 crc kubenswrapper[4765]: I1003 08:42:28.536820 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" event={"ID":"6824483c-e9a7-4e95-bb3d-e00bac2af3aa","Type":"ContainerStarted","Data":"e181d22182830405576dc481c6ceb96e6f00f953d6f457a1ccf04d22f49d1c40"} Oct 03 08:42:28 crc kubenswrapper[4765]: I1003 08:42:28.537209 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wdwf5" event={"ID":"6824483c-e9a7-4e95-bb3d-e00bac2af3aa","Type":"ContainerStarted","Data":"a496c4ba484a2e13f85cd194f7c306d2cdc928d130cf8b5bb14745ebd4f00a72"} Oct 03 08:42:28 crc kubenswrapper[4765]: I1003 08:42:28.547829 4765 generic.go:334] "Generic (PLEG): container finished" podID="b280de40-e91c-4010-9173-48ed01320bd4" containerID="6bec777ef28515667057c6422dcdbba25fa7ae8da0af00ba91b0ea9fc843e51a" exitCode=0 Oct 03 08:42:28 crc kubenswrapper[4765]: I1003 08:42:28.547910 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgxf4" event={"ID":"b280de40-e91c-4010-9173-48ed01320bd4","Type":"ContainerDied","Data":"6bec777ef28515667057c6422dcdbba25fa7ae8da0af00ba91b0ea9fc843e51a"} Oct 03 08:42:28 crc kubenswrapper[4765]: I1003 08:42:28.567661 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wdwf5" podStartSLOduration=172.567618038 podStartE2EDuration="2m52.567618038s" podCreationTimestamp="2025-10-03 08:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:28.567440974 +0000 UTC m=+192.868935314" watchObservedRunningTime="2025-10-03 08:42:28.567618038 +0000 UTC m=+192.869112368" Oct 03 08:42:29 crc kubenswrapper[4765]: I1003 08:42:29.560373 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgxf4" event={"ID":"b280de40-e91c-4010-9173-48ed01320bd4","Type":"ContainerStarted","Data":"4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9"} Oct 03 08:42:29 crc kubenswrapper[4765]: I1003 08:42:29.583169 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sgxf4" podStartSLOduration=4.669597803 podStartE2EDuration="42.583146132s" podCreationTimestamp="2025-10-03 08:41:47 +0000 UTC" firstStartedPulling="2025-10-03 08:41:51.218571205 +0000 UTC m=+155.520065535" lastFinishedPulling="2025-10-03 08:42:29.132119534 +0000 UTC m=+193.433613864" observedRunningTime="2025-10-03 08:42:29.580473026 +0000 UTC m=+193.881967356" watchObservedRunningTime="2025-10-03 08:42:29.583146132 +0000 UTC m=+193.884640462" Oct 03 08:42:30 crc kubenswrapper[4765]: I1003 08:42:30.680863 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:42:30 crc kubenswrapper[4765]: I1003 08:42:30.681401 4765 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:42:38 crc kubenswrapper[4765]: I1003 08:42:38.241464 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:42:38 crc kubenswrapper[4765]: I1003 08:42:38.241943 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:42:38 crc kubenswrapper[4765]: I1003 08:42:38.722439 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:42:38 crc kubenswrapper[4765]: I1003 08:42:38.765855 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:42:39 crc kubenswrapper[4765]: I1003 08:42:39.611359 4765 generic.go:334] "Generic (PLEG): container finished" podID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" containerID="a1ff09d8d7d1e8f1b42a68c31bfa58310d634a870f25d75ff4172fc0e79ba901" exitCode=0 Oct 03 08:42:39 crc kubenswrapper[4765]: I1003 08:42:39.611460 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hhf6" event={"ID":"52d70e1c-3f04-4bab-a6a3-2ea9d66489db","Type":"ContainerDied","Data":"a1ff09d8d7d1e8f1b42a68c31bfa58310d634a870f25d75ff4172fc0e79ba901"} Oct 03 08:42:48 crc kubenswrapper[4765]: I1003 08:42:48.659795 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hhf6" event={"ID":"52d70e1c-3f04-4bab-a6a3-2ea9d66489db","Type":"ContainerStarted","Data":"8684de6f72267e6ca3e62c5a7d4e19b18182b6272fc31ebf41077147710b10ea"} Oct 03 08:42:48 crc kubenswrapper[4765]: I1003 08:42:48.662078 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w7fr" event={"ID":"ba3fb502-6081-420d-8ef8-a249a7e69e60","Type":"ContainerStarted","Data":"b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805"} Oct 03 08:42:48 crc kubenswrapper[4765]: I1003 08:42:48.669998 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmg4j" event={"ID":"111267ae-7647-4185-8fd3-138ad5d3a864","Type":"ContainerStarted","Data":"033e0909b90cda367ea286420b15c72e32b33394ddc6d9a99b3b9a05ba65761b"} Oct 03 08:42:48 crc kubenswrapper[4765]: I1003 08:42:48.674265 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lvxz" event={"ID":"4de34feb-a2a4-49c9-b066-f7a71b39cd06","Type":"ContainerStarted","Data":"4e2480de9decc3f0297ff9d3cc848e06dae49b21c99cb3c43529b9e5dc9b1b52"} Oct 03 08:42:48 crc kubenswrapper[4765]: I1003 08:42:48.677054 4765 generic.go:334] "Generic (PLEG): container finished" podID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" containerID="18e9132017ffd38aa69175bedc7b1494048cbc198a5b5005d34304f760f36385" exitCode=0 Oct 03 08:42:48 crc kubenswrapper[4765]: I1003 08:42:48.677119 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pvgz" event={"ID":"9c3ab66b-9f3e-4764-a5f6-acf1f378e489","Type":"ContainerDied","Data":"18e9132017ffd38aa69175bedc7b1494048cbc198a5b5005d34304f760f36385"} Oct 03 08:42:48 crc kubenswrapper[4765]: I1003 08:42:48.679517 4765 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q84qb" event={"ID":"de26c094-f060-4b1d-b06f-13bf0f1794ce","Type":"ContainerStarted","Data":"fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6"} Oct 03 08:42:48 crc kubenswrapper[4765]: I1003 08:42:48.686824 4765 generic.go:334] "Generic (PLEG): container finished" podID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" containerID="0c1ef063524c93625fe0bc43a07be7b73848dcafff64c393cbbe45be66a995d0" exitCode=0 Oct 03 08:42:48 crc kubenswrapper[4765]: I1003 08:42:48.686901 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfkjv" event={"ID":"d55c53fc-df46-4bb6-a4b7-4d269d965dc6","Type":"ContainerDied","Data":"0c1ef063524c93625fe0bc43a07be7b73848dcafff64c393cbbe45be66a995d0"} Oct 03 08:42:48 crc kubenswrapper[4765]: I1003 08:42:48.693145 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9hhf6" podStartSLOduration=3.329122574 podStartE2EDuration="1m4.693125514s" podCreationTimestamp="2025-10-03 08:41:44 +0000 UTC" firstStartedPulling="2025-10-03 08:41:45.938476378 +0000 UTC m=+150.239970708" lastFinishedPulling="2025-10-03 08:42:47.302479318 +0000 UTC m=+211.603973648" observedRunningTime="2025-10-03 08:42:48.690780116 +0000 UTC m=+212.992274476" watchObservedRunningTime="2025-10-03 08:42:48.693125514 +0000 UTC m=+212.994619854" Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 08:42:49.697551 4765 generic.go:334] "Generic (PLEG): container finished" podID="111267ae-7647-4185-8fd3-138ad5d3a864" containerID="033e0909b90cda367ea286420b15c72e32b33394ddc6d9a99b3b9a05ba65761b" exitCode=0 Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 08:42:49.697679 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmg4j" event={"ID":"111267ae-7647-4185-8fd3-138ad5d3a864","Type":"ContainerDied","Data":"033e0909b90cda367ea286420b15c72e32b33394ddc6d9a99b3b9a05ba65761b"} Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 08:42:49.701842 4765 generic.go:334] "Generic (PLEG): container finished" podID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" containerID="4e2480de9decc3f0297ff9d3cc848e06dae49b21c99cb3c43529b9e5dc9b1b52" exitCode=0 Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 08:42:49.701917 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lvxz" event={"ID":"4de34feb-a2a4-49c9-b066-f7a71b39cd06","Type":"ContainerDied","Data":"4e2480de9decc3f0297ff9d3cc848e06dae49b21c99cb3c43529b9e5dc9b1b52"} Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 08:42:49.708063 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pvgz" event={"ID":"9c3ab66b-9f3e-4764-a5f6-acf1f378e489","Type":"ContainerStarted","Data":"3d8c676dacbb48fe27f2cfecb7a396cd39128e0795625e853466eb8204c8f279"} Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 08:42:49.710631 4765 generic.go:334] "Generic (PLEG): container finished" podID="de26c094-f060-4b1d-b06f-13bf0f1794ce" containerID="fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6" exitCode=0 Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 08:42:49.710887 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q84qb" event={"ID":"de26c094-f060-4b1d-b06f-13bf0f1794ce","Type":"ContainerDied","Data":"fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6"} Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 
08:42:49.714794 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfkjv" event={"ID":"d55c53fc-df46-4bb6-a4b7-4d269d965dc6","Type":"ContainerStarted","Data":"4cd5aa9415cd0abd8a14b58991ad92118f4698c67b70e6d1b549d831f8ca1f5c"} Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 08:42:49.724201 4765 generic.go:334] "Generic (PLEG): container finished" podID="ba3fb502-6081-420d-8ef8-a249a7e69e60" containerID="b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805" exitCode=0 Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 08:42:49.724256 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w7fr" event={"ID":"ba3fb502-6081-420d-8ef8-a249a7e69e60","Type":"ContainerDied","Data":"b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805"} Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 08:42:49.749322 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8pvgz" podStartSLOduration=2.7083884769999997 podStartE2EDuration="1m3.749299599s" podCreationTimestamp="2025-10-03 08:41:46 +0000 UTC" firstStartedPulling="2025-10-03 08:41:48.053278249 +0000 UTC m=+152.354772579" lastFinishedPulling="2025-10-03 08:42:49.094189371 +0000 UTC m=+213.395683701" observedRunningTime="2025-10-03 08:42:49.746897449 +0000 UTC m=+214.048391789" watchObservedRunningTime="2025-10-03 08:42:49.749299599 +0000 UTC m=+214.050793929" Oct 03 08:42:49 crc kubenswrapper[4765]: I1003 08:42:49.802664 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zfkjv" podStartSLOduration=2.694468518 podStartE2EDuration="1m3.802611913s" podCreationTimestamp="2025-10-03 08:41:46 +0000 UTC" firstStartedPulling="2025-10-03 08:41:48.021002624 +0000 UTC m=+152.322496954" lastFinishedPulling="2025-10-03 08:42:49.129146019 +0000 UTC m=+213.430640349" observedRunningTime="2025-10-03 08:42:49.798308907 +0000 UTC m=+214.099803247" watchObservedRunningTime="2025-10-03 08:42:49.802611913 +0000 UTC m=+214.104106243" Oct 03 08:42:50 crc kubenswrapper[4765]: I1003 08:42:50.734740 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w7fr" event={"ID":"ba3fb502-6081-420d-8ef8-a249a7e69e60","Type":"ContainerStarted","Data":"ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e"} Oct 03 08:42:50 crc kubenswrapper[4765]: I1003 08:42:50.737998 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmg4j" event={"ID":"111267ae-7647-4185-8fd3-138ad5d3a864","Type":"ContainerStarted","Data":"065ee4deea1f9d34192d94623da8ede035b792f88a1d90f09f382ce4bd3078d9"} Oct 03 08:42:50 crc kubenswrapper[4765]: I1003 08:42:50.740174 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lvxz" event={"ID":"4de34feb-a2a4-49c9-b066-f7a71b39cd06","Type":"ContainerStarted","Data":"535142c5dddf7f4fe82c5b7ad47d58d91bc4e851ca45cfa62f12be107faeca43"} Oct 03 08:42:50 crc kubenswrapper[4765]: I1003 08:42:50.743205 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q84qb" event={"ID":"de26c094-f060-4b1d-b06f-13bf0f1794ce","Type":"ContainerStarted","Data":"ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51"} Oct 03 08:42:50 crc kubenswrapper[4765]: I1003 08:42:50.759470 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-4w7fr" podStartSLOduration=2.382589549 podStartE2EDuration="1m6.75944828s" podCreationTimestamp="2025-10-03 08:41:44 +0000 UTC" firstStartedPulling="2025-10-03 08:41:45.959710761 +0000 UTC m=+150.261205091" lastFinishedPulling="2025-10-03 08:42:50.336569492 +0000 UTC m=+214.638063822" observedRunningTime="2025-10-03 08:42:50.757392129 +0000 UTC m=+215.058886459" watchObservedRunningTime="2025-10-03 08:42:50.75944828 +0000 UTC m=+215.060942610" Oct 03 08:42:50 crc kubenswrapper[4765]: I1003 08:42:50.790858 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fmg4j" podStartSLOduration=3.689028745 podStartE2EDuration="1m2.79083439s" podCreationTimestamp="2025-10-03 08:41:48 +0000 UTC" firstStartedPulling="2025-10-03 08:41:51.221698122 +0000 UTC m=+155.523192452" lastFinishedPulling="2025-10-03 08:42:50.323503767 +0000 UTC m=+214.624998097" observedRunningTime="2025-10-03 08:42:50.783672822 +0000 UTC m=+215.085167162" watchObservedRunningTime="2025-10-03 08:42:50.79083439 +0000 UTC m=+215.092328720" Oct 03 08:42:50 crc kubenswrapper[4765]: I1003 08:42:50.805530 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q84qb" podStartSLOduration=2.592637425 podStartE2EDuration="1m6.805503005s" podCreationTimestamp="2025-10-03 08:41:44 +0000 UTC" firstStartedPulling="2025-10-03 08:41:45.981282063 +0000 UTC m=+150.282776393" lastFinishedPulling="2025-10-03 08:42:50.194147643 +0000 UTC m=+214.495641973" observedRunningTime="2025-10-03 08:42:50.803273469 +0000 UTC m=+215.104767799" watchObservedRunningTime="2025-10-03 08:42:50.805503005 +0000 UTC m=+215.106997335" Oct 03 08:42:50 crc kubenswrapper[4765]: I1003 08:42:50.830699 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8lvxz" podStartSLOduration=2.720037471 podStartE2EDuration="1m6.83067561s" podCreationTimestamp="2025-10-03 08:41:44 +0000 UTC" firstStartedPulling="2025-10-03 08:41:45.980971806 +0000 UTC m=+150.282466136" lastFinishedPulling="2025-10-03 08:42:50.091609945 +0000 UTC m=+214.393104275" observedRunningTime="2025-10-03 08:42:50.827260205 +0000 UTC m=+215.128754535" watchObservedRunningTime="2025-10-03 08:42:50.83067561 +0000 UTC m=+215.132169940" Oct 03 08:42:54 crc kubenswrapper[4765]: I1003 08:42:54.747983 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:42:54 crc kubenswrapper[4765]: I1003 08:42:54.748621 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:42:54 crc kubenswrapper[4765]: I1003 08:42:54.800666 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:42:54 crc kubenswrapper[4765]: I1003 08:42:54.855916 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:42:55 crc kubenswrapper[4765]: I1003 08:42:55.051907 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:42:55 crc kubenswrapper[4765]: I1003 08:42:55.052003 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:42:55 crc 
kubenswrapper[4765]: I1003 08:42:55.104290 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:42:55 crc kubenswrapper[4765]: I1003 08:42:55.251250 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:42:55 crc kubenswrapper[4765]: I1003 08:42:55.251380 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:42:55 crc kubenswrapper[4765]: I1003 08:42:55.298076 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:42:55 crc kubenswrapper[4765]: I1003 08:42:55.343334 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:42:55 crc kubenswrapper[4765]: I1003 08:42:55.343405 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:42:55 crc kubenswrapper[4765]: I1003 08:42:55.383679 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:42:55 crc kubenswrapper[4765]: I1003 08:42:55.810115 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:42:55 crc kubenswrapper[4765]: I1003 08:42:55.810248 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:42:55 crc kubenswrapper[4765]: I1003 08:42:55.813153 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:42:56 crc kubenswrapper[4765]: I1003 08:42:56.718012 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:42:56 crc kubenswrapper[4765]: I1003 08:42:56.718446 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:42:56 crc kubenswrapper[4765]: I1003 08:42:56.760777 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:42:56 crc kubenswrapper[4765]: I1003 08:42:56.821482 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:42:57 crc kubenswrapper[4765]: I1003 08:42:57.122073 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:42:57 crc kubenswrapper[4765]: I1003 08:42:57.122159 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:42:57 crc kubenswrapper[4765]: I1003 08:42:57.148376 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4w7fr"] Oct 03 08:42:57 crc kubenswrapper[4765]: I1003 08:42:57.180306 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:42:57 crc kubenswrapper[4765]: I1003 08:42:57.822235 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.150353 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q84qb"] Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.151663 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q84qb" podUID="de26c094-f060-4b1d-b06f-13bf0f1794ce" containerName="registry-server" containerID="cri-o://ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51" gracePeriod=2 Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.505971 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.543706 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.543762 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.594532 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.690008 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-catalog-content\") pod \"de26c094-f060-4b1d-b06f-13bf0f1794ce\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.690077 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-utilities\") pod \"de26c094-f060-4b1d-b06f-13bf0f1794ce\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.690112 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2269\" (UniqueName: \"kubernetes.io/projected/de26c094-f060-4b1d-b06f-13bf0f1794ce-kube-api-access-b2269\") pod \"de26c094-f060-4b1d-b06f-13bf0f1794ce\" (UID: \"de26c094-f060-4b1d-b06f-13bf0f1794ce\") " Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.690903 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-utilities" (OuterVolumeSpecName: "utilities") pod "de26c094-f060-4b1d-b06f-13bf0f1794ce" (UID: "de26c094-f060-4b1d-b06f-13bf0f1794ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.695806 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de26c094-f060-4b1d-b06f-13bf0f1794ce-kube-api-access-b2269" (OuterVolumeSpecName: "kube-api-access-b2269") pod "de26c094-f060-4b1d-b06f-13bf0f1794ce" (UID: "de26c094-f060-4b1d-b06f-13bf0f1794ce"). InnerVolumeSpecName "kube-api-access-b2269". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.745089 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de26c094-f060-4b1d-b06f-13bf0f1794ce" (UID: "de26c094-f060-4b1d-b06f-13bf0f1794ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.791129 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.791167 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2269\" (UniqueName: \"kubernetes.io/projected/de26c094-f060-4b1d-b06f-13bf0f1794ce-kube-api-access-b2269\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.791183 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de26c094-f060-4b1d-b06f-13bf0f1794ce-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.791432 4765 generic.go:334] "Generic (PLEG): container finished" podID="de26c094-f060-4b1d-b06f-13bf0f1794ce" containerID="ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51" exitCode=0 Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.791537 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q84qb" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.791598 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q84qb" event={"ID":"de26c094-f060-4b1d-b06f-13bf0f1794ce","Type":"ContainerDied","Data":"ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51"} Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.791687 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q84qb" event={"ID":"de26c094-f060-4b1d-b06f-13bf0f1794ce","Type":"ContainerDied","Data":"52c675bd5ed319ccf4f0e24a6691cac2de9331fbb4ab75f165fccfaff5957ba7"} Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.791712 4765 scope.go:117] "RemoveContainer" containerID="ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.792539 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4w7fr" podUID="ba3fb502-6081-420d-8ef8-a249a7e69e60" containerName="registry-server" containerID="cri-o://ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e" gracePeriod=2 Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.811371 4765 scope.go:117] "RemoveContainer" containerID="fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.823526 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q84qb"] Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.833885 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q84qb"] Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.849567 4765 
scope.go:117] "RemoveContainer" containerID="5934c00221ebc58714a0bbfc40a96eb4bc372f3a98f5c73e3917ad88b760581f" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.849631 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.914353 4765 scope.go:117] "RemoveContainer" containerID="ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51" Oct 03 08:42:58 crc kubenswrapper[4765]: E1003 08:42:58.914958 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51\": container with ID starting with ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51 not found: ID does not exist" containerID="ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.915010 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51"} err="failed to get container status \"ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51\": rpc error: code = NotFound desc = could not find container \"ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51\": container with ID starting with ac57b949cbeff458cc6ea8531f4cb7cfce6ddf327e6ae3c5debc46cd9cd28c51 not found: ID does not exist" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.915070 4765 scope.go:117] "RemoveContainer" containerID="fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6" Oct 03 08:42:58 crc kubenswrapper[4765]: E1003 08:42:58.915943 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6\": container with ID starting with fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6 not found: ID does not exist" containerID="fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.916016 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6"} err="failed to get container status \"fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6\": rpc error: code = NotFound desc = could not find container \"fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6\": container with ID starting with fbad4950c3ead7b282d572b338e9a5e37cf4bf2143356af03a02759632fbeec6 not found: ID does not exist" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.916080 4765 scope.go:117] "RemoveContainer" containerID="5934c00221ebc58714a0bbfc40a96eb4bc372f3a98f5c73e3917ad88b760581f" Oct 03 08:42:58 crc kubenswrapper[4765]: E1003 08:42:58.916737 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5934c00221ebc58714a0bbfc40a96eb4bc372f3a98f5c73e3917ad88b760581f\": container with ID starting with 5934c00221ebc58714a0bbfc40a96eb4bc372f3a98f5c73e3917ad88b760581f not found: ID does not exist" containerID="5934c00221ebc58714a0bbfc40a96eb4bc372f3a98f5c73e3917ad88b760581f" Oct 03 08:42:58 crc kubenswrapper[4765]: I1003 08:42:58.916774 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5934c00221ebc58714a0bbfc40a96eb4bc372f3a98f5c73e3917ad88b760581f"} err="failed to get container status \"5934c00221ebc58714a0bbfc40a96eb4bc372f3a98f5c73e3917ad88b760581f\": rpc error: code = NotFound desc = could not find container \"5934c00221ebc58714a0bbfc40a96eb4bc372f3a98f5c73e3917ad88b760581f\": container with ID starting with 5934c00221ebc58714a0bbfc40a96eb4bc372f3a98f5c73e3917ad88b760581f not found: ID does not exist" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.109091 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.296790 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtv5z\" (UniqueName: \"kubernetes.io/projected/ba3fb502-6081-420d-8ef8-a249a7e69e60-kube-api-access-gtv5z\") pod \"ba3fb502-6081-420d-8ef8-a249a7e69e60\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.296914 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-utilities\") pod \"ba3fb502-6081-420d-8ef8-a249a7e69e60\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.297405 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-catalog-content\") pod \"ba3fb502-6081-420d-8ef8-a249a7e69e60\" (UID: \"ba3fb502-6081-420d-8ef8-a249a7e69e60\") " Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.298126 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-utilities" (OuterVolumeSpecName: "utilities") pod "ba3fb502-6081-420d-8ef8-a249a7e69e60" (UID: "ba3fb502-6081-420d-8ef8-a249a7e69e60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.306555 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3fb502-6081-420d-8ef8-a249a7e69e60-kube-api-access-gtv5z" (OuterVolumeSpecName: "kube-api-access-gtv5z") pod "ba3fb502-6081-420d-8ef8-a249a7e69e60" (UID: "ba3fb502-6081-420d-8ef8-a249a7e69e60"). InnerVolumeSpecName "kube-api-access-gtv5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.310248 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.311253 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtv5z\" (UniqueName: \"kubernetes.io/projected/ba3fb502-6081-420d-8ef8-a249a7e69e60-kube-api-access-gtv5z\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.351633 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba3fb502-6081-420d-8ef8-a249a7e69e60" (UID: "ba3fb502-6081-420d-8ef8-a249a7e69e60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.414144 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3fb502-6081-420d-8ef8-a249a7e69e60-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.550875 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pvgz"] Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.799857 4765 generic.go:334] "Generic (PLEG): container finished" podID="ba3fb502-6081-420d-8ef8-a249a7e69e60" containerID="ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e" exitCode=0 Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.799954 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w7fr" event={"ID":"ba3fb502-6081-420d-8ef8-a249a7e69e60","Type":"ContainerDied","Data":"ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e"} Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.799982 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4w7fr" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.800013 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4w7fr" event={"ID":"ba3fb502-6081-420d-8ef8-a249a7e69e60","Type":"ContainerDied","Data":"5ae33041628482000243745b7c41e2095c7901d591469dbc168d0237d9d6841b"} Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.800035 4765 scope.go:117] "RemoveContainer" containerID="ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.802180 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8pvgz" podUID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" containerName="registry-server" containerID="cri-o://3d8c676dacbb48fe27f2cfecb7a396cd39128e0795625e853466eb8204c8f279" gracePeriod=2 Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.815085 4765 scope.go:117] "RemoveContainer" containerID="b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.832610 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4w7fr"] Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.836240 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4w7fr"] Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.851401 4765 scope.go:117] "RemoveContainer" containerID="0bd2d5d61d76395548491dfb4b9e73b01c57de23e4db86dc0a918ebc4dd36f60" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.863403 4765 scope.go:117] "RemoveContainer" containerID="ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e" Oct 03 08:42:59 crc kubenswrapper[4765]: E1003 08:42:59.863797 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e\": container with ID starting with ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e not found: ID does not exist" containerID="ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.863841 
4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e"} err="failed to get container status \"ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e\": rpc error: code = NotFound desc = could not find container \"ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e\": container with ID starting with ea90f378b290faa2d9f700f85818898d39d1d868088a809139820761cb11e78e not found: ID does not exist" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.863867 4765 scope.go:117] "RemoveContainer" containerID="b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805" Oct 03 08:42:59 crc kubenswrapper[4765]: E1003 08:42:59.864179 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805\": container with ID starting with b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805 not found: ID does not exist" containerID="b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.864202 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805"} err="failed to get container status \"b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805\": rpc error: code = NotFound desc = could not find container \"b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805\": container with ID starting with b25af2cc26f0d693473336bb041ccce03f2540a447126bd2fe131df3602ea805 not found: ID does not exist" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.864217 4765 scope.go:117] "RemoveContainer" containerID="0bd2d5d61d76395548491dfb4b9e73b01c57de23e4db86dc0a918ebc4dd36f60" Oct 03 08:42:59 crc kubenswrapper[4765]: E1003 08:42:59.864672 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd2d5d61d76395548491dfb4b9e73b01c57de23e4db86dc0a918ebc4dd36f60\": container with ID starting with 0bd2d5d61d76395548491dfb4b9e73b01c57de23e4db86dc0a918ebc4dd36f60 not found: ID does not exist" containerID="0bd2d5d61d76395548491dfb4b9e73b01c57de23e4db86dc0a918ebc4dd36f60" Oct 03 08:42:59 crc kubenswrapper[4765]: I1003 08:42:59.864712 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd2d5d61d76395548491dfb4b9e73b01c57de23e4db86dc0a918ebc4dd36f60"} err="failed to get container status \"0bd2d5d61d76395548491dfb4b9e73b01c57de23e4db86dc0a918ebc4dd36f60\": rpc error: code = NotFound desc = could not find container \"0bd2d5d61d76395548491dfb4b9e73b01c57de23e4db86dc0a918ebc4dd36f60\": container with ID starting with 0bd2d5d61d76395548491dfb4b9e73b01c57de23e4db86dc0a918ebc4dd36f60 not found: ID does not exist" Oct 03 08:43:00 crc kubenswrapper[4765]: I1003 08:43:00.312764 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3fb502-6081-420d-8ef8-a249a7e69e60" path="/var/lib/kubelet/pods/ba3fb502-6081-420d-8ef8-a249a7e69e60/volumes" Oct 03 08:43:00 crc kubenswrapper[4765]: I1003 08:43:00.313541 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de26c094-f060-4b1d-b06f-13bf0f1794ce" path="/var/lib/kubelet/pods/de26c094-f060-4b1d-b06f-13bf0f1794ce/volumes" Oct 03 08:43:00 crc kubenswrapper[4765]: I1003 
08:43:00.680227 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:43:00 crc kubenswrapper[4765]: I1003 08:43:00.680286 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:43:00 crc kubenswrapper[4765]: I1003 08:43:00.680329 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:43:00 crc kubenswrapper[4765]: I1003 08:43:00.680886 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281"} pod="openshift-machine-config-operator/machine-config-daemon-j8mss" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:43:00 crc kubenswrapper[4765]: I1003 08:43:00.680941 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" containerID="cri-o://714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281" gracePeriod=600 Oct 03 08:43:00 crc kubenswrapper[4765]: I1003 08:43:00.813978 4765 generic.go:334] "Generic (PLEG): container finished" podID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" containerID="3d8c676dacbb48fe27f2cfecb7a396cd39128e0795625e853466eb8204c8f279" exitCode=0 Oct 03 08:43:00 crc kubenswrapper[4765]: I1003 08:43:00.814150 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pvgz" event={"ID":"9c3ab66b-9f3e-4764-a5f6-acf1f378e489","Type":"ContainerDied","Data":"3d8c676dacbb48fe27f2cfecb7a396cd39128e0795625e853466eb8204c8f279"} Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.322449 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.337489 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-utilities\") pod \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.337527 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lj49\" (UniqueName: \"kubernetes.io/projected/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-kube-api-access-8lj49\") pod \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.337593 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-catalog-content\") pod \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\" (UID: \"9c3ab66b-9f3e-4764-a5f6-acf1f378e489\") " Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.347362 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-utilities" (OuterVolumeSpecName: "utilities") pod "9c3ab66b-9f3e-4764-a5f6-acf1f378e489" (UID: "9c3ab66b-9f3e-4764-a5f6-acf1f378e489"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.347510 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-kube-api-access-8lj49" (OuterVolumeSpecName: "kube-api-access-8lj49") pod "9c3ab66b-9f3e-4764-a5f6-acf1f378e489" (UID: "9c3ab66b-9f3e-4764-a5f6-acf1f378e489"). InnerVolumeSpecName "kube-api-access-8lj49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.366338 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c3ab66b-9f3e-4764-a5f6-acf1f378e489" (UID: "9c3ab66b-9f3e-4764-a5f6-acf1f378e489"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.439510 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.439541 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.439554 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lj49\" (UniqueName: \"kubernetes.io/projected/9c3ab66b-9f3e-4764-a5f6-acf1f378e489-kube-api-access-8lj49\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.822257 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8pvgz" event={"ID":"9c3ab66b-9f3e-4764-a5f6-acf1f378e489","Type":"ContainerDied","Data":"21b8779ebb8f94cbe94f21afc3850c00b6533f1a040e8f5ffb25cdad6c3f81da"} Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.822299 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8pvgz" Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.822314 4765 scope.go:117] "RemoveContainer" containerID="3d8c676dacbb48fe27f2cfecb7a396cd39128e0795625e853466eb8204c8f279" Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.824685 4765 generic.go:334] "Generic (PLEG): container finished" podID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerID="714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281" exitCode=0 Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.824922 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerDied","Data":"714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281"} Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.836249 4765 scope.go:117] "RemoveContainer" containerID="18e9132017ffd38aa69175bedc7b1494048cbc198a5b5005d34304f760f36385" Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.852529 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pvgz"] Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.856416 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8pvgz"] Oct 03 08:43:01 crc kubenswrapper[4765]: I1003 08:43:01.867220 4765 scope.go:117] "RemoveContainer" containerID="0a19d6023eb03d5076d3dd7fcf1dfb4aa15c68389cdb222d1e96593faabc90de" Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.314752 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" path="/var/lib/kubelet/pods/9c3ab66b-9f3e-4764-a5f6-acf1f378e489/volumes" Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.546103 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmg4j"] Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.546308 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fmg4j" podUID="111267ae-7647-4185-8fd3-138ad5d3a864" containerName="registry-server" 
containerID="cri-o://065ee4deea1f9d34192d94623da8ede035b792f88a1d90f09f382ce4bd3078d9" gracePeriod=2 Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.850372 4765 generic.go:334] "Generic (PLEG): container finished" podID="111267ae-7647-4185-8fd3-138ad5d3a864" containerID="065ee4deea1f9d34192d94623da8ede035b792f88a1d90f09f382ce4bd3078d9" exitCode=0 Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.850427 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmg4j" event={"ID":"111267ae-7647-4185-8fd3-138ad5d3a864","Type":"ContainerDied","Data":"065ee4deea1f9d34192d94623da8ede035b792f88a1d90f09f382ce4bd3078d9"} Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.868241 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"f8f01250a245724a5319a16414554a23f7e9e678c70099524c09e960696e7846"} Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.944045 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.960121 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-utilities\") pod \"111267ae-7647-4185-8fd3-138ad5d3a864\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.960206 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htwg7\" (UniqueName: \"kubernetes.io/projected/111267ae-7647-4185-8fd3-138ad5d3a864-kube-api-access-htwg7\") pod \"111267ae-7647-4185-8fd3-138ad5d3a864\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.960253 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-catalog-content\") pod \"111267ae-7647-4185-8fd3-138ad5d3a864\" (UID: \"111267ae-7647-4185-8fd3-138ad5d3a864\") " Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.961393 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-utilities" (OuterVolumeSpecName: "utilities") pod "111267ae-7647-4185-8fd3-138ad5d3a864" (UID: "111267ae-7647-4185-8fd3-138ad5d3a864"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:02 crc kubenswrapper[4765]: I1003 08:43:02.966195 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111267ae-7647-4185-8fd3-138ad5d3a864-kube-api-access-htwg7" (OuterVolumeSpecName: "kube-api-access-htwg7") pod "111267ae-7647-4185-8fd3-138ad5d3a864" (UID: "111267ae-7647-4185-8fd3-138ad5d3a864"). InnerVolumeSpecName "kube-api-access-htwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:03 crc kubenswrapper[4765]: I1003 08:43:03.043181 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "111267ae-7647-4185-8fd3-138ad5d3a864" (UID: "111267ae-7647-4185-8fd3-138ad5d3a864"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:03 crc kubenswrapper[4765]: I1003 08:43:03.062300 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:03 crc kubenswrapper[4765]: I1003 08:43:03.062351 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htwg7\" (UniqueName: \"kubernetes.io/projected/111267ae-7647-4185-8fd3-138ad5d3a864-kube-api-access-htwg7\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:03 crc kubenswrapper[4765]: I1003 08:43:03.062558 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111267ae-7647-4185-8fd3-138ad5d3a864-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:03 crc kubenswrapper[4765]: I1003 08:43:03.876755 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fmg4j" Oct 03 08:43:03 crc kubenswrapper[4765]: I1003 08:43:03.876934 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fmg4j" event={"ID":"111267ae-7647-4185-8fd3-138ad5d3a864","Type":"ContainerDied","Data":"19344b48e560f3ef461a7947addbe9957fec9b26f66c6ec945605f664684e0c1"} Oct 03 08:43:03 crc kubenswrapper[4765]: I1003 08:43:03.877878 4765 scope.go:117] "RemoveContainer" containerID="065ee4deea1f9d34192d94623da8ede035b792f88a1d90f09f382ce4bd3078d9" Oct 03 08:43:03 crc kubenswrapper[4765]: I1003 08:43:03.899165 4765 scope.go:117] "RemoveContainer" containerID="033e0909b90cda367ea286420b15c72e32b33394ddc6d9a99b3b9a05ba65761b" Oct 03 08:43:03 crc kubenswrapper[4765]: I1003 08:43:03.904141 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fmg4j"] Oct 03 08:43:03 crc kubenswrapper[4765]: I1003 08:43:03.907223 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fmg4j"] Oct 03 08:43:03 crc kubenswrapper[4765]: I1003 08:43:03.938058 4765 scope.go:117] "RemoveContainer" containerID="da742df054ea3fed9b0b9ebbc09a13684254c1f1e93d9aefc2ecbbf1fb847657" Oct 03 08:43:04 crc kubenswrapper[4765]: I1003 08:43:04.313082 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="111267ae-7647-4185-8fd3-138ad5d3a864" path="/var/lib/kubelet/pods/111267ae-7647-4185-8fd3-138ad5d3a864/volumes" Oct 03 08:43:06 crc kubenswrapper[4765]: I1003 08:43:06.648454 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jgs5w"] Oct 03 08:43:31 crc kubenswrapper[4765]: I1003 08:43:31.673084 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" podUID="0ea22c01-e088-40b8-aecd-e83fe862bc78" containerName="oauth-openshift" containerID="cri-o://ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd" gracePeriod=15 Oct 03 08:43:31 crc kubenswrapper[4765]: I1003 08:43:31.995657 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.016295 4765 generic.go:334] "Generic (PLEG): container finished" podID="0ea22c01-e088-40b8-aecd-e83fe862bc78" containerID="ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd" exitCode=0 Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.016373 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" event={"ID":"0ea22c01-e088-40b8-aecd-e83fe862bc78","Type":"ContainerDied","Data":"ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd"} Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.016408 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" event={"ID":"0ea22c01-e088-40b8-aecd-e83fe862bc78","Type":"ContainerDied","Data":"23e25634a8e1beb840e011822048cce038653c029b8b1d089b39e89d75c8a421"} Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.016428 4765 scope.go:117] "RemoveContainer" containerID="ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.016597 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jgs5w" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033167 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7765894ccc-n7lwz"] Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033475 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de26c094-f060-4b1d-b06f-13bf0f1794ce" containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033489 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="de26c094-f060-4b1d-b06f-13bf0f1794ce" containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033502 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" containerName="extract-utilities" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033509 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" containerName="extract-utilities" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033517 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de26c094-f060-4b1d-b06f-13bf0f1794ce" containerName="extract-utilities" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033525 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="de26c094-f060-4b1d-b06f-13bf0f1794ce" containerName="extract-utilities" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033535 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111267ae-7647-4185-8fd3-138ad5d3a864" containerName="extract-utilities" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033543 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="111267ae-7647-4185-8fd3-138ad5d3a864" containerName="extract-utilities" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033550 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3fb502-6081-420d-8ef8-a249a7e69e60" containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033556 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3fb502-6081-420d-8ef8-a249a7e69e60" 
containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033565 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de26c094-f060-4b1d-b06f-13bf0f1794ce" containerName="extract-content" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033570 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="de26c094-f060-4b1d-b06f-13bf0f1794ce" containerName="extract-content" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033589 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3fb502-6081-420d-8ef8-a249a7e69e60" containerName="extract-utilities" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033596 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3fb502-6081-420d-8ef8-a249a7e69e60" containerName="extract-utilities" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033626 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea22c01-e088-40b8-aecd-e83fe862bc78" containerName="oauth-openshift" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033635 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea22c01-e088-40b8-aecd-e83fe862bc78" containerName="oauth-openshift" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033662 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033669 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033681 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111267ae-7647-4185-8fd3-138ad5d3a864" containerName="extract-content" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033691 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="111267ae-7647-4185-8fd3-138ad5d3a864" containerName="extract-content" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033704 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6" containerName="pruner" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033712 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6" containerName="pruner" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033724 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23" containerName="pruner" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033730 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23" containerName="pruner" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033740 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" containerName="extract-content" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033745 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" containerName="extract-content" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033753 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111267ae-7647-4185-8fd3-138ad5d3a864" containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033759 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="111267ae-7647-4185-8fd3-138ad5d3a864" 
containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033766 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="432cff95-d219-46af-bfc4-c5afbe99c9c0" containerName="collect-profiles" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033772 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="432cff95-d219-46af-bfc4-c5afbe99c9c0" containerName="collect-profiles" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.033783 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3fb502-6081-420d-8ef8-a249a7e69e60" containerName="extract-content" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033789 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3fb502-6081-420d-8ef8-a249a7e69e60" containerName="extract-content" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033897 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3fb502-6081-420d-8ef8-a249a7e69e60" containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033915 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d532aa95-c0e2-4e0a-b3e9-d67cdf15b1a6" containerName="pruner" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033925 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="432cff95-d219-46af-bfc4-c5afbe99c9c0" containerName="collect-profiles" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033932 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3ab66b-9f3e-4764-a5f6-acf1f378e489" containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033940 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="111267ae-7647-4185-8fd3-138ad5d3a864" containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033950 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c5e0b35-5d25-4e2b-9a0a-accdf6cc5e23" containerName="pruner" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033962 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea22c01-e088-40b8-aecd-e83fe862bc78" containerName="oauth-openshift" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.033972 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="de26c094-f060-4b1d-b06f-13bf0f1794ce" containerName="registry-server" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.034482 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.037820 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7765894ccc-n7lwz"] Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.047993 4765 scope.go:117] "RemoveContainer" containerID="ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd" Oct 03 08:43:32 crc kubenswrapper[4765]: E1003 08:43:32.048432 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd\": container with ID starting with ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd not found: ID does not exist" containerID="ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.048470 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd"} err="failed to get container status \"ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd\": rpc error: code = NotFound desc = could not find container \"ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd\": container with ID starting with ea4d2d07de738dd0bf91ad87156e5f9a973511e56d3a418ec90cb37f5349f6fd not found: ID does not exist" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142483 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-policies\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142553 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-service-ca\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142603 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-provider-selection\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142714 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-login\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142753 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-idp-0-file-data\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142777 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-ocp-branding-template\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142798 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwks2\" (UniqueName: \"kubernetes.io/projected/0ea22c01-e088-40b8-aecd-e83fe862bc78-kube-api-access-qwks2\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142828 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-serving-cert\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142852 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-error\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142885 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-cliconfig\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142911 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-dir\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.142933 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-router-certs\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.143609 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-trusted-ca-bundle\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.143682 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-session\") pod \"0ea22c01-e088-40b8-aecd-e83fe862bc78\" (UID: \"0ea22c01-e088-40b8-aecd-e83fe862bc78\") " Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.143876 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-dir" 
(OuterVolumeSpecName: "audit-dir") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144082 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-audit-policies\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144127 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144154 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-service-ca\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144191 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-session\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144220 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144268 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfvdt\" (UniqueName: \"kubernetes.io/projected/72215636-3977-4a51-8a14-7ba9c464d45a-kube-api-access-lfvdt\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144294 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-template-login\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144343 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144382 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144619 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144690 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-template-error\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144724 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-router-certs\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144815 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/72215636-3977-4a51-8a14-7ba9c464d45a-audit-dir\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144860 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144875 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.144923 4765 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.145060 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.145081 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.145151 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.149614 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.149909 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.150075 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.150391 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.150460 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea22c01-e088-40b8-aecd-e83fe862bc78-kube-api-access-qwks2" (OuterVolumeSpecName: "kube-api-access-qwks2") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "kube-api-access-qwks2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.150979 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.151045 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.151692 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.155359 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "0ea22c01-e088-40b8-aecd-e83fe862bc78" (UID: "0ea22c01-e088-40b8-aecd-e83fe862bc78"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246126 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-audit-policies\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246202 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246227 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-service-ca\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246258 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-session\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246282 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246313 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfvdt\" (UniqueName: \"kubernetes.io/projected/72215636-3977-4a51-8a14-7ba9c464d45a-kube-api-access-lfvdt\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246355 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-template-login\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246384 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" 
Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246416 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246446 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246479 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-template-error\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246501 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-router-certs\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246545 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/72215636-3977-4a51-8a14-7ba9c464d45a-audit-dir\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246573 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246618 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246633 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246671 4765 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc 
kubenswrapper[4765]: I1003 08:43:32.246685 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246698 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246712 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246725 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246736 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246747 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwks2\" (UniqueName: \"kubernetes.io/projected/0ea22c01-e088-40b8-aecd-e83fe862bc78-kube-api-access-qwks2\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246758 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246778 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246792 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.246803 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0ea22c01-e088-40b8-aecd-e83fe862bc78-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.247344 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.247403 4765 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-audit-policies\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.247578 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/72215636-3977-4a51-8a14-7ba9c464d45a-audit-dir\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.248571 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-service-ca\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.250367 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.250465 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-session\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.250776 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.251022 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-template-error\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.252374 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.253170 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.253298 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-template-login\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.253319 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.256149 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/72215636-3977-4a51-8a14-7ba9c464d45a-v4-0-config-system-router-certs\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.264776 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfvdt\" (UniqueName: \"kubernetes.io/projected/72215636-3977-4a51-8a14-7ba9c464d45a-kube-api-access-lfvdt\") pod \"oauth-openshift-7765894ccc-n7lwz\" (UID: \"72215636-3977-4a51-8a14-7ba9c464d45a\") " pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.347447 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jgs5w"] Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.350570 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jgs5w"] Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.356013 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:32 crc kubenswrapper[4765]: I1003 08:43:32.551794 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7765894ccc-n7lwz"] Oct 03 08:43:33 crc kubenswrapper[4765]: I1003 08:43:33.024186 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" event={"ID":"72215636-3977-4a51-8a14-7ba9c464d45a","Type":"ContainerStarted","Data":"332107ff573d9f8c3bbdb198e979d0a8e7b4b11b38f6ed250c280c7af1443f39"} Oct 03 08:43:33 crc kubenswrapper[4765]: I1003 08:43:33.024289 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" event={"ID":"72215636-3977-4a51-8a14-7ba9c464d45a","Type":"ContainerStarted","Data":"7fc38ba7449f1f4cfa830af9fc4da1b8a9b00155a8c57cdcd3e5159ef0d84cd1"} Oct 03 08:43:33 crc kubenswrapper[4765]: I1003 08:43:33.024499 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:33 crc kubenswrapper[4765]: I1003 08:43:33.382379 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" Oct 03 08:43:33 crc kubenswrapper[4765]: I1003 08:43:33.408056 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7765894ccc-n7lwz" podStartSLOduration=27.408035167 podStartE2EDuration="27.408035167s" podCreationTimestamp="2025-10-03 08:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:43:33.04434514 +0000 UTC m=+257.345839480" watchObservedRunningTime="2025-10-03 08:43:33.408035167 +0000 UTC m=+257.709529497" Oct 03 08:43:34 crc kubenswrapper[4765]: I1003 08:43:34.316019 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea22c01-e088-40b8-aecd-e83fe862bc78" path="/var/lib/kubelet/pods/0ea22c01-e088-40b8-aecd-e83fe862bc78/volumes" Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.923922 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9hhf6"] Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.924754 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9hhf6" podUID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" containerName="registry-server" containerID="cri-o://8684de6f72267e6ca3e62c5a7d4e19b18182b6272fc31ebf41077147710b10ea" gracePeriod=30 Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.935438 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8lvxz"] Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.936166 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8lvxz" podUID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" containerName="registry-server" containerID="cri-o://535142c5dddf7f4fe82c5b7ad47d58d91bc4e851ca45cfa62f12be107faeca43" gracePeriod=30 Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.952989 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9cw"] Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.953235 4765 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" podUID="28cc4e4f-507b-49c7-9a8f-2107e600e834" containerName="marketplace-operator" containerID="cri-o://9fc01a9d3ec9443dd862e65771dc9f3be592acce41d965949734c07886215d4a" gracePeriod=30 Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.968119 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfkjv"] Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.968411 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zfkjv" podUID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" containerName="registry-server" containerID="cri-o://4cd5aa9415cd0abd8a14b58991ad92118f4698c67b70e6d1b549d831f8ca1f5c" gracePeriod=30 Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.984538 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpzd6"] Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.985458 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.988215 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sgxf4"] Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.988485 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sgxf4" podUID="b280de40-e91c-4010-9173-48ed01320bd4" containerName="registry-server" containerID="cri-o://4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9" gracePeriod=30 Oct 03 08:43:43 crc kubenswrapper[4765]: I1003 08:43:43.999147 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpzd6"] Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.102538 4765 generic.go:334] "Generic (PLEG): container finished" podID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" containerID="4cd5aa9415cd0abd8a14b58991ad92118f4698c67b70e6d1b549d831f8ca1f5c" exitCode=0 Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.102659 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfkjv" event={"ID":"d55c53fc-df46-4bb6-a4b7-4d269d965dc6","Type":"ContainerDied","Data":"4cd5aa9415cd0abd8a14b58991ad92118f4698c67b70e6d1b549d831f8ca1f5c"} Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.119535 4765 generic.go:334] "Generic (PLEG): container finished" podID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" containerID="8684de6f72267e6ca3e62c5a7d4e19b18182b6272fc31ebf41077147710b10ea" exitCode=0 Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.119620 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hhf6" event={"ID":"52d70e1c-3f04-4bab-a6a3-2ea9d66489db","Type":"ContainerDied","Data":"8684de6f72267e6ca3e62c5a7d4e19b18182b6272fc31ebf41077147710b10ea"} Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.130837 4765 generic.go:334] "Generic (PLEG): container finished" podID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" containerID="535142c5dddf7f4fe82c5b7ad47d58d91bc4e851ca45cfa62f12be107faeca43" exitCode=0 Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.130925 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lvxz" 
event={"ID":"4de34feb-a2a4-49c9-b066-f7a71b39cd06","Type":"ContainerDied","Data":"535142c5dddf7f4fe82c5b7ad47d58d91bc4e851ca45cfa62f12be107faeca43"} Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.134609 4765 generic.go:334] "Generic (PLEG): container finished" podID="28cc4e4f-507b-49c7-9a8f-2107e600e834" containerID="9fc01a9d3ec9443dd862e65771dc9f3be592acce41d965949734c07886215d4a" exitCode=0 Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.134655 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" event={"ID":"28cc4e4f-507b-49c7-9a8f-2107e600e834","Type":"ContainerDied","Data":"9fc01a9d3ec9443dd862e65771dc9f3be592acce41d965949734c07886215d4a"} Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.152424 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a266cf6c-8014-4ce5-a57a-9851e4f971c5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpzd6\" (UID: \"a266cf6c-8014-4ce5-a57a-9851e4f971c5\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.152508 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkpgs\" (UniqueName: \"kubernetes.io/projected/a266cf6c-8014-4ce5-a57a-9851e4f971c5-kube-api-access-qkpgs\") pod \"marketplace-operator-79b997595-fpzd6\" (UID: \"a266cf6c-8014-4ce5-a57a-9851e4f971c5\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.152577 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a266cf6c-8014-4ce5-a57a-9851e4f971c5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fpzd6\" (UID: \"a266cf6c-8014-4ce5-a57a-9851e4f971c5\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.253824 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkpgs\" (UniqueName: \"kubernetes.io/projected/a266cf6c-8014-4ce5-a57a-9851e4f971c5-kube-api-access-qkpgs\") pod \"marketplace-operator-79b997595-fpzd6\" (UID: \"a266cf6c-8014-4ce5-a57a-9851e4f971c5\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.253896 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a266cf6c-8014-4ce5-a57a-9851e4f971c5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fpzd6\" (UID: \"a266cf6c-8014-4ce5-a57a-9851e4f971c5\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.253945 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a266cf6c-8014-4ce5-a57a-9851e4f971c5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpzd6\" (UID: \"a266cf6c-8014-4ce5-a57a-9851e4f971c5\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.256555 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a266cf6c-8014-4ce5-a57a-9851e4f971c5-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fpzd6\" (UID: \"a266cf6c-8014-4ce5-a57a-9851e4f971c5\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.262245 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a266cf6c-8014-4ce5-a57a-9851e4f971c5-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpzd6\" (UID: \"a266cf6c-8014-4ce5-a57a-9851e4f971c5\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.272170 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkpgs\" (UniqueName: \"kubernetes.io/projected/a266cf6c-8014-4ce5-a57a-9851e4f971c5-kube-api-access-qkpgs\") pod \"marketplace-operator-79b997595-fpzd6\" (UID: \"a266cf6c-8014-4ce5-a57a-9851e4f971c5\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.436385 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.441702 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.449618 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.452203 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.463748 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.483513 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.558783 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc4db\" (UniqueName: \"kubernetes.io/projected/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-kube-api-access-fc4db\") pod \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.558841 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-catalog-content\") pod \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.558881 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-trusted-ca\") pod \"28cc4e4f-507b-49c7-9a8f-2107e600e834\" (UID: \"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.558918 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-utilities\") pod \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.558958 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-operator-metrics\") pod \"28cc4e4f-507b-49c7-9a8f-2107e600e834\" (UID: \"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.559244 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85mlv\" (UniqueName: \"kubernetes.io/projected/28cc4e4f-507b-49c7-9a8f-2107e600e834-kube-api-access-85mlv\") pod \"28cc4e4f-507b-49c7-9a8f-2107e600e834\" (UID: \"28cc4e4f-507b-49c7-9a8f-2107e600e834\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.559336 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-catalog-content\") pod \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.559371 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68rb5\" (UniqueName: \"kubernetes.io/projected/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-kube-api-access-68rb5\") pod \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\" (UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.559429 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-catalog-content\") pod \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\" (UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.559467 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-utilities\") pod \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\" (UID: \"52d70e1c-3f04-4bab-a6a3-2ea9d66489db\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.559513 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-utilities\") pod \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\" (UID: \"d55c53fc-df46-4bb6-a4b7-4d269d965dc6\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.559538 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vwkv\" (UniqueName: \"kubernetes.io/projected/4de34feb-a2a4-49c9-b066-f7a71b39cd06-kube-api-access-6vwkv\") pod \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\" (UID: \"4de34feb-a2a4-49c9-b066-f7a71b39cd06\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.560194 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-utilities" (OuterVolumeSpecName: "utilities") pod "4de34feb-a2a4-49c9-b066-f7a71b39cd06" (UID: "4de34feb-a2a4-49c9-b066-f7a71b39cd06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.560479 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "28cc4e4f-507b-49c7-9a8f-2107e600e834" (UID: "28cc4e4f-507b-49c7-9a8f-2107e600e834"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.560823 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-utilities" (OuterVolumeSpecName: "utilities") pod "d55c53fc-df46-4bb6-a4b7-4d269d965dc6" (UID: "d55c53fc-df46-4bb6-a4b7-4d269d965dc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.560909 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-utilities" (OuterVolumeSpecName: "utilities") pod "52d70e1c-3f04-4bab-a6a3-2ea9d66489db" (UID: "52d70e1c-3f04-4bab-a6a3-2ea9d66489db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.563198 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-kube-api-access-68rb5" (OuterVolumeSpecName: "kube-api-access-68rb5") pod "d55c53fc-df46-4bb6-a4b7-4d269d965dc6" (UID: "d55c53fc-df46-4bb6-a4b7-4d269d965dc6"). InnerVolumeSpecName "kube-api-access-68rb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.563682 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "28cc4e4f-507b-49c7-9a8f-2107e600e834" (UID: "28cc4e4f-507b-49c7-9a8f-2107e600e834"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.565000 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28cc4e4f-507b-49c7-9a8f-2107e600e834-kube-api-access-85mlv" (OuterVolumeSpecName: "kube-api-access-85mlv") pod "28cc4e4f-507b-49c7-9a8f-2107e600e834" (UID: "28cc4e4f-507b-49c7-9a8f-2107e600e834"). InnerVolumeSpecName "kube-api-access-85mlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.566801 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-kube-api-access-fc4db" (OuterVolumeSpecName: "kube-api-access-fc4db") pod "52d70e1c-3f04-4bab-a6a3-2ea9d66489db" (UID: "52d70e1c-3f04-4bab-a6a3-2ea9d66489db"). InnerVolumeSpecName "kube-api-access-fc4db". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.568068 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de34feb-a2a4-49c9-b066-f7a71b39cd06-kube-api-access-6vwkv" (OuterVolumeSpecName: "kube-api-access-6vwkv") pod "4de34feb-a2a4-49c9-b066-f7a71b39cd06" (UID: "4de34feb-a2a4-49c9-b066-f7a71b39cd06"). InnerVolumeSpecName "kube-api-access-6vwkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.589052 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d55c53fc-df46-4bb6-a4b7-4d269d965dc6" (UID: "d55c53fc-df46-4bb6-a4b7-4d269d965dc6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.631557 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52d70e1c-3f04-4bab-a6a3-2ea9d66489db" (UID: "52d70e1c-3f04-4bab-a6a3-2ea9d66489db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.638215 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4de34feb-a2a4-49c9-b066-f7a71b39cd06" (UID: "4de34feb-a2a4-49c9-b066-f7a71b39cd06"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661414 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-catalog-content\") pod \"b280de40-e91c-4010-9173-48ed01320bd4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661484 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-utilities\") pod \"b280de40-e91c-4010-9173-48ed01320bd4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661509 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7sjc\" (UniqueName: \"kubernetes.io/projected/b280de40-e91c-4010-9173-48ed01320bd4-kube-api-access-h7sjc\") pod \"b280de40-e91c-4010-9173-48ed01320bd4\" (UID: \"b280de40-e91c-4010-9173-48ed01320bd4\") " Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661780 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661792 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68rb5\" (UniqueName: \"kubernetes.io/projected/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-kube-api-access-68rb5\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661802 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661813 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661821 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d55c53fc-df46-4bb6-a4b7-4d269d965dc6-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661829 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vwkv\" (UniqueName: \"kubernetes.io/projected/4de34feb-a2a4-49c9-b066-f7a71b39cd06-kube-api-access-6vwkv\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661888 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc4db\" (UniqueName: \"kubernetes.io/projected/52d70e1c-3f04-4bab-a6a3-2ea9d66489db-kube-api-access-fc4db\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661901 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661910 4765 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-trusted-ca\") on node 
\"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661918 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4de34feb-a2a4-49c9-b066-f7a71b39cd06-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661926 4765 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/28cc4e4f-507b-49c7-9a8f-2107e600e834-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.661935 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85mlv\" (UniqueName: \"kubernetes.io/projected/28cc4e4f-507b-49c7-9a8f-2107e600e834-kube-api-access-85mlv\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.662529 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-utilities" (OuterVolumeSpecName: "utilities") pod "b280de40-e91c-4010-9173-48ed01320bd4" (UID: "b280de40-e91c-4010-9173-48ed01320bd4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.665845 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b280de40-e91c-4010-9173-48ed01320bd4-kube-api-access-h7sjc" (OuterVolumeSpecName: "kube-api-access-h7sjc") pod "b280de40-e91c-4010-9173-48ed01320bd4" (UID: "b280de40-e91c-4010-9173-48ed01320bd4"). InnerVolumeSpecName "kube-api-access-h7sjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.745150 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b280de40-e91c-4010-9173-48ed01320bd4" (UID: "b280de40-e91c-4010-9173-48ed01320bd4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.763430 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.763478 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7sjc\" (UniqueName: \"kubernetes.io/projected/b280de40-e91c-4010-9173-48ed01320bd4-kube-api-access-h7sjc\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.763495 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b280de40-e91c-4010-9173-48ed01320bd4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:44 crc kubenswrapper[4765]: I1003 08:43:44.886100 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpzd6"] Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.141594 4765 generic.go:334] "Generic (PLEG): container finished" podID="b280de40-e91c-4010-9173-48ed01320bd4" containerID="4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9" exitCode=0 Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.141679 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sgxf4" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.141661 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgxf4" event={"ID":"b280de40-e91c-4010-9173-48ed01320bd4","Type":"ContainerDied","Data":"4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9"} Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.142207 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sgxf4" event={"ID":"b280de40-e91c-4010-9173-48ed01320bd4","Type":"ContainerDied","Data":"aef576156db95aa8b8d3f3694194eaacda5ebd22be014388ad123590a841fd4c"} Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.142227 4765 scope.go:117] "RemoveContainer" containerID="4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.149486 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9hhf6" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.149464 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9hhf6" event={"ID":"52d70e1c-3f04-4bab-a6a3-2ea9d66489db","Type":"ContainerDied","Data":"813fe3195fae0fcc8814e443c775e31effa0c57a4ee5f78bde2bc2d0dcea0d1b"} Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.153908 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8lvxz" event={"ID":"4de34feb-a2a4-49c9-b066-f7a71b39cd06","Type":"ContainerDied","Data":"0850e92d765660f276b519be6f2dba13010d1a6d02efa38b6ad3722a95673323"} Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.154012 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8lvxz" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.157327 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" event={"ID":"28cc4e4f-507b-49c7-9a8f-2107e600e834","Type":"ContainerDied","Data":"cf64aba4343e2af32e954b206eacf6fe7a97d7645315c42a21256e478f188b22"} Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.157367 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9g9cw" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.162464 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zfkjv" event={"ID":"d55c53fc-df46-4bb6-a4b7-4d269d965dc6","Type":"ContainerDied","Data":"1889bbcf53f87ace244a7ea33e7f37a9c1c476bfe674cd6b469ccf9ba1f4d99b"} Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.162561 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zfkjv" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.171939 4765 scope.go:117] "RemoveContainer" containerID="6bec777ef28515667057c6422dcdbba25fa7ae8da0af00ba91b0ea9fc843e51a" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.173255 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" event={"ID":"a266cf6c-8014-4ce5-a57a-9851e4f971c5","Type":"ContainerStarted","Data":"a6a4a6bfc34717eeb40fa04f9ccf3bb65c55744e1c3fd3ce85f4bd75ae3c6770"} Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.173406 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" event={"ID":"a266cf6c-8014-4ce5-a57a-9851e4f971c5","Type":"ContainerStarted","Data":"756ef7f9d6b1310c0b3992bc6924b5f9c1bda4d105ccd4027b5c7edb2b430311"} Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.173776 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.175892 4765 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fpzd6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.175927 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" podUID="a266cf6c-8014-4ce5-a57a-9851e4f971c5" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.196256 4765 scope.go:117] "RemoveContainer" containerID="ce4f29e8ce2d9a0eea23527c493640dfd37319f108e127813877781a2a1e4451" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.197040 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sgxf4"] Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.201136 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sgxf4"] Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.218892 4765 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" podStartSLOduration=2.21886774 podStartE2EDuration="2.21886774s" podCreationTimestamp="2025-10-03 08:43:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:43:45.217534447 +0000 UTC m=+269.519028777" watchObservedRunningTime="2025-10-03 08:43:45.21886774 +0000 UTC m=+269.520362070" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.230515 4765 scope.go:117] "RemoveContainer" containerID="4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.230949 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9\": container with ID starting with 4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9 not found: ID does not exist" containerID="4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.231001 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9"} err="failed to get container status \"4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9\": rpc error: code = NotFound desc = could not find container \"4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9\": container with ID starting with 4d1b5c3436013a2e933445c0bfdfcf3c9a279ff54ea8bf4f7e4180d2abae5bd9 not found: ID does not exist" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.231034 4765 scope.go:117] "RemoveContainer" containerID="6bec777ef28515667057c6422dcdbba25fa7ae8da0af00ba91b0ea9fc843e51a" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.232343 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bec777ef28515667057c6422dcdbba25fa7ae8da0af00ba91b0ea9fc843e51a\": container with ID starting with 6bec777ef28515667057c6422dcdbba25fa7ae8da0af00ba91b0ea9fc843e51a not found: ID does not exist" containerID="6bec777ef28515667057c6422dcdbba25fa7ae8da0af00ba91b0ea9fc843e51a" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.232385 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bec777ef28515667057c6422dcdbba25fa7ae8da0af00ba91b0ea9fc843e51a"} err="failed to get container status \"6bec777ef28515667057c6422dcdbba25fa7ae8da0af00ba91b0ea9fc843e51a\": rpc error: code = NotFound desc = could not find container \"6bec777ef28515667057c6422dcdbba25fa7ae8da0af00ba91b0ea9fc843e51a\": container with ID starting with 6bec777ef28515667057c6422dcdbba25fa7ae8da0af00ba91b0ea9fc843e51a not found: ID does not exist" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.232416 4765 scope.go:117] "RemoveContainer" containerID="ce4f29e8ce2d9a0eea23527c493640dfd37319f108e127813877781a2a1e4451" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.232790 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce4f29e8ce2d9a0eea23527c493640dfd37319f108e127813877781a2a1e4451\": container with ID starting with ce4f29e8ce2d9a0eea23527c493640dfd37319f108e127813877781a2a1e4451 not found: ID does not 
exist" containerID="ce4f29e8ce2d9a0eea23527c493640dfd37319f108e127813877781a2a1e4451" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.232836 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce4f29e8ce2d9a0eea23527c493640dfd37319f108e127813877781a2a1e4451"} err="failed to get container status \"ce4f29e8ce2d9a0eea23527c493640dfd37319f108e127813877781a2a1e4451\": rpc error: code = NotFound desc = could not find container \"ce4f29e8ce2d9a0eea23527c493640dfd37319f108e127813877781a2a1e4451\": container with ID starting with ce4f29e8ce2d9a0eea23527c493640dfd37319f108e127813877781a2a1e4451 not found: ID does not exist" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.232866 4765 scope.go:117] "RemoveContainer" containerID="8684de6f72267e6ca3e62c5a7d4e19b18182b6272fc31ebf41077147710b10ea" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.236085 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9cw"] Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.247322 4765 scope.go:117] "RemoveContainer" containerID="a1ff09d8d7d1e8f1b42a68c31bfa58310d634a870f25d75ff4172fc0e79ba901" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.250027 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9g9cw"] Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.257771 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfkjv"] Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.262786 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zfkjv"] Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.274052 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8lvxz"] Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.279592 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8lvxz"] Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.289411 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9hhf6"] Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.291938 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9hhf6"] Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.295733 4765 scope.go:117] "RemoveContainer" containerID="a62e6791acdd0b386426527d3d9540d23a6430da734d022b40ce600c4d95995a" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.316950 4765 scope.go:117] "RemoveContainer" containerID="535142c5dddf7f4fe82c5b7ad47d58d91bc4e851ca45cfa62f12be107faeca43" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.333001 4765 scope.go:117] "RemoveContainer" containerID="4e2480de9decc3f0297ff9d3cc848e06dae49b21c99cb3c43529b9e5dc9b1b52" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.360738 4765 scope.go:117] "RemoveContainer" containerID="c2e8e2d4dc455287d7e0c91edd5b9f890e1d42f190459e0b7478e41746b05130" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.375430 4765 scope.go:117] "RemoveContainer" containerID="9fc01a9d3ec9443dd862e65771dc9f3be592acce41d965949734c07886215d4a" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.388494 4765 scope.go:117] "RemoveContainer" containerID="4cd5aa9415cd0abd8a14b58991ad92118f4698c67b70e6d1b549d831f8ca1f5c" Oct 03 08:43:45 crc 
kubenswrapper[4765]: I1003 08:43:45.401516 4765 scope.go:117] "RemoveContainer" containerID="0c1ef063524c93625fe0bc43a07be7b73848dcafff64c393cbbe45be66a995d0" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.418132 4765 scope.go:117] "RemoveContainer" containerID="0285cd5fc73847426afcdfeb5b226e9ed90d74c3612eea25d3fdaa161ed38cb9" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.933612 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vw7jl"] Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.933877 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b280de40-e91c-4010-9173-48ed01320bd4" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.933894 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b280de40-e91c-4010-9173-48ed01320bd4" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.933902 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" containerName="extract-content" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.933910 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" containerName="extract-content" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.933925 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b280de40-e91c-4010-9173-48ed01320bd4" containerName="extract-utilities" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.933935 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b280de40-e91c-4010-9173-48ed01320bd4" containerName="extract-utilities" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.933945 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.933953 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.933968 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" containerName="extract-utilities" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.933976 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" containerName="extract-utilities" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.933984 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" containerName="extract-utilities" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.933992 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" containerName="extract-utilities" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.934002 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cc4e4f-507b-49c7-9a8f-2107e600e834" containerName="marketplace-operator" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.934038 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cc4e4f-507b-49c7-9a8f-2107e600e834" containerName="marketplace-operator" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.934047 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" containerName="extract-utilities" Oct 03 08:43:45 crc 
kubenswrapper[4765]: I1003 08:43:45.934055 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" containerName="extract-utilities" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.934067 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" containerName="extract-content" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.934074 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" containerName="extract-content" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.934084 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b280de40-e91c-4010-9173-48ed01320bd4" containerName="extract-content" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.934091 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b280de40-e91c-4010-9173-48ed01320bd4" containerName="extract-content" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.934100 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.934109 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.934121 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" containerName="extract-content" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.934127 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" containerName="extract-content" Oct 03 08:43:45 crc kubenswrapper[4765]: E1003 08:43:45.934140 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.934149 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.934269 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cc4e4f-507b-49c7-9a8f-2107e600e834" containerName="marketplace-operator" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.934281 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b280de40-e91c-4010-9173-48ed01320bd4" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.934298 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.934310 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.934319 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" containerName="registry-server" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.935147 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.937042 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 08:43:45 crc kubenswrapper[4765]: I1003 08:43:45.948799 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vw7jl"] Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.083693 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1baebbf-ae2b-4d7c-a366-3c4ecb7741db-utilities\") pod \"certified-operators-vw7jl\" (UID: \"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db\") " pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.083758 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnkjf\" (UniqueName: \"kubernetes.io/projected/b1baebbf-ae2b-4d7c-a366-3c4ecb7741db-kube-api-access-qnkjf\") pod \"certified-operators-vw7jl\" (UID: \"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db\") " pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.083809 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1baebbf-ae2b-4d7c-a366-3c4ecb7741db-catalog-content\") pod \"certified-operators-vw7jl\" (UID: \"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db\") " pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.185054 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1baebbf-ae2b-4d7c-a366-3c4ecb7741db-utilities\") pod \"certified-operators-vw7jl\" (UID: \"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db\") " pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.185169 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnkjf\" (UniqueName: \"kubernetes.io/projected/b1baebbf-ae2b-4d7c-a366-3c4ecb7741db-kube-api-access-qnkjf\") pod \"certified-operators-vw7jl\" (UID: \"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db\") " pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.185211 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1baebbf-ae2b-4d7c-a366-3c4ecb7741db-catalog-content\") pod \"certified-operators-vw7jl\" (UID: \"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db\") " pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.185629 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1baebbf-ae2b-4d7c-a366-3c4ecb7741db-utilities\") pod \"certified-operators-vw7jl\" (UID: \"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db\") " pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.185690 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1baebbf-ae2b-4d7c-a366-3c4ecb7741db-catalog-content\") pod \"certified-operators-vw7jl\" (UID: 
\"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db\") " pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.192442 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fpzd6" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.202599 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnkjf\" (UniqueName: \"kubernetes.io/projected/b1baebbf-ae2b-4d7c-a366-3c4ecb7741db-kube-api-access-qnkjf\") pod \"certified-operators-vw7jl\" (UID: \"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db\") " pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.253163 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.314563 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28cc4e4f-507b-49c7-9a8f-2107e600e834" path="/var/lib/kubelet/pods/28cc4e4f-507b-49c7-9a8f-2107e600e834/volumes" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.315506 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de34feb-a2a4-49c9-b066-f7a71b39cd06" path="/var/lib/kubelet/pods/4de34feb-a2a4-49c9-b066-f7a71b39cd06/volumes" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.316416 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d70e1c-3f04-4bab-a6a3-2ea9d66489db" path="/var/lib/kubelet/pods/52d70e1c-3f04-4bab-a6a3-2ea9d66489db/volumes" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.318029 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b280de40-e91c-4010-9173-48ed01320bd4" path="/var/lib/kubelet/pods/b280de40-e91c-4010-9173-48ed01320bd4/volumes" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.318698 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d55c53fc-df46-4bb6-a4b7-4d269d965dc6" path="/var/lib/kubelet/pods/d55c53fc-df46-4bb6-a4b7-4d269d965dc6/volumes" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.537822 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zg6xn"] Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.539474 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.542204 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.549905 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zg6xn"] Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.657270 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vw7jl"] Oct 03 08:43:46 crc kubenswrapper[4765]: W1003 08:43:46.666578 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1baebbf_ae2b_4d7c_a366_3c4ecb7741db.slice/crio-a1282649a160484cd4e8d0790bce0d2b55b8da14a778c87a246205d0346d27ca WatchSource:0}: Error finding container a1282649a160484cd4e8d0790bce0d2b55b8da14a778c87a246205d0346d27ca: Status 404 returned error can't find the container with id a1282649a160484cd4e8d0790bce0d2b55b8da14a778c87a246205d0346d27ca Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.691845 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66ed5f9c-a15e-45b4-b79f-e574371609c8-utilities\") pod \"redhat-marketplace-zg6xn\" (UID: \"66ed5f9c-a15e-45b4-b79f-e574371609c8\") " pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.692407 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk2vl\" (UniqueName: \"kubernetes.io/projected/66ed5f9c-a15e-45b4-b79f-e574371609c8-kube-api-access-dk2vl\") pod \"redhat-marketplace-zg6xn\" (UID: \"66ed5f9c-a15e-45b4-b79f-e574371609c8\") " pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.692601 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66ed5f9c-a15e-45b4-b79f-e574371609c8-catalog-content\") pod \"redhat-marketplace-zg6xn\" (UID: \"66ed5f9c-a15e-45b4-b79f-e574371609c8\") " pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.794041 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66ed5f9c-a15e-45b4-b79f-e574371609c8-utilities\") pod \"redhat-marketplace-zg6xn\" (UID: \"66ed5f9c-a15e-45b4-b79f-e574371609c8\") " pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.794336 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk2vl\" (UniqueName: \"kubernetes.io/projected/66ed5f9c-a15e-45b4-b79f-e574371609c8-kube-api-access-dk2vl\") pod \"redhat-marketplace-zg6xn\" (UID: \"66ed5f9c-a15e-45b4-b79f-e574371609c8\") " pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.794467 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66ed5f9c-a15e-45b4-b79f-e574371609c8-catalog-content\") pod \"redhat-marketplace-zg6xn\" (UID: \"66ed5f9c-a15e-45b4-b79f-e574371609c8\") " 
pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.794572 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66ed5f9c-a15e-45b4-b79f-e574371609c8-utilities\") pod \"redhat-marketplace-zg6xn\" (UID: \"66ed5f9c-a15e-45b4-b79f-e574371609c8\") " pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.794885 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66ed5f9c-a15e-45b4-b79f-e574371609c8-catalog-content\") pod \"redhat-marketplace-zg6xn\" (UID: \"66ed5f9c-a15e-45b4-b79f-e574371609c8\") " pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.814542 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk2vl\" (UniqueName: \"kubernetes.io/projected/66ed5f9c-a15e-45b4-b79f-e574371609c8-kube-api-access-dk2vl\") pod \"redhat-marketplace-zg6xn\" (UID: \"66ed5f9c-a15e-45b4-b79f-e574371609c8\") " pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:46 crc kubenswrapper[4765]: I1003 08:43:46.866375 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:47 crc kubenswrapper[4765]: I1003 08:43:47.053237 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zg6xn"] Oct 03 08:43:47 crc kubenswrapper[4765]: W1003 08:43:47.063148 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66ed5f9c_a15e_45b4_b79f_e574371609c8.slice/crio-b71deaa54e5d05da9f293fa357a0ca3af040dcba39c0cfe50596bf278c727cd1 WatchSource:0}: Error finding container b71deaa54e5d05da9f293fa357a0ca3af040dcba39c0cfe50596bf278c727cd1: Status 404 returned error can't find the container with id b71deaa54e5d05da9f293fa357a0ca3af040dcba39c0cfe50596bf278c727cd1 Oct 03 08:43:47 crc kubenswrapper[4765]: I1003 08:43:47.195203 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zg6xn" event={"ID":"66ed5f9c-a15e-45b4-b79f-e574371609c8","Type":"ContainerStarted","Data":"b71deaa54e5d05da9f293fa357a0ca3af040dcba39c0cfe50596bf278c727cd1"} Oct 03 08:43:47 crc kubenswrapper[4765]: I1003 08:43:47.196782 4765 generic.go:334] "Generic (PLEG): container finished" podID="b1baebbf-ae2b-4d7c-a366-3c4ecb7741db" containerID="75827cae9e8184ff7f427eb63ce35eb15498ea89d036acec72ff7c202316a3ce" exitCode=0 Oct 03 08:43:47 crc kubenswrapper[4765]: I1003 08:43:47.197848 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vw7jl" event={"ID":"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db","Type":"ContainerDied","Data":"75827cae9e8184ff7f427eb63ce35eb15498ea89d036acec72ff7c202316a3ce"} Oct 03 08:43:47 crc kubenswrapper[4765]: I1003 08:43:47.197868 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vw7jl" event={"ID":"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db","Type":"ContainerStarted","Data":"a1282649a160484cd4e8d0790bce0d2b55b8da14a778c87a246205d0346d27ca"} Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.205633 4765 generic.go:334] "Generic (PLEG): container finished" podID="66ed5f9c-a15e-45b4-b79f-e574371609c8" 
containerID="a446d27d0665c14a8b82faf9307b499152cf59850516391a58cae3a757bc150c" exitCode=0 Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.205754 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zg6xn" event={"ID":"66ed5f9c-a15e-45b4-b79f-e574371609c8","Type":"ContainerDied","Data":"a446d27d0665c14a8b82faf9307b499152cf59850516391a58cae3a757bc150c"} Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.209684 4765 generic.go:334] "Generic (PLEG): container finished" podID="b1baebbf-ae2b-4d7c-a366-3c4ecb7741db" containerID="555b946d9bbf9dab777650aa7b30fb4d8ecd2941db2b205a107f00060702124f" exitCode=0 Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.209794 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vw7jl" event={"ID":"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db","Type":"ContainerDied","Data":"555b946d9bbf9dab777650aa7b30fb4d8ecd2941db2b205a107f00060702124f"} Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.336190 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bgnsr"] Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.338050 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.340028 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.352942 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bgnsr"] Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.422992 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085812e0-4616-4b7e-a4b4-1301aa194042-utilities\") pod \"redhat-operators-bgnsr\" (UID: \"085812e0-4616-4b7e-a4b4-1301aa194042\") " pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.423072 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085812e0-4616-4b7e-a4b4-1301aa194042-catalog-content\") pod \"redhat-operators-bgnsr\" (UID: \"085812e0-4616-4b7e-a4b4-1301aa194042\") " pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.423112 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fddz5\" (UniqueName: \"kubernetes.io/projected/085812e0-4616-4b7e-a4b4-1301aa194042-kube-api-access-fddz5\") pod \"redhat-operators-bgnsr\" (UID: \"085812e0-4616-4b7e-a4b4-1301aa194042\") " pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.524468 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085812e0-4616-4b7e-a4b4-1301aa194042-utilities\") pod \"redhat-operators-bgnsr\" (UID: \"085812e0-4616-4b7e-a4b4-1301aa194042\") " pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.524561 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/085812e0-4616-4b7e-a4b4-1301aa194042-catalog-content\") pod \"redhat-operators-bgnsr\" (UID: \"085812e0-4616-4b7e-a4b4-1301aa194042\") " pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.524598 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fddz5\" (UniqueName: \"kubernetes.io/projected/085812e0-4616-4b7e-a4b4-1301aa194042-kube-api-access-fddz5\") pod \"redhat-operators-bgnsr\" (UID: \"085812e0-4616-4b7e-a4b4-1301aa194042\") " pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.525241 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085812e0-4616-4b7e-a4b4-1301aa194042-utilities\") pod \"redhat-operators-bgnsr\" (UID: \"085812e0-4616-4b7e-a4b4-1301aa194042\") " pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.525264 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085812e0-4616-4b7e-a4b4-1301aa194042-catalog-content\") pod \"redhat-operators-bgnsr\" (UID: \"085812e0-4616-4b7e-a4b4-1301aa194042\") " pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.549146 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fddz5\" (UniqueName: \"kubernetes.io/projected/085812e0-4616-4b7e-a4b4-1301aa194042-kube-api-access-fddz5\") pod \"redhat-operators-bgnsr\" (UID: \"085812e0-4616-4b7e-a4b4-1301aa194042\") " pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.660581 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.913004 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bgnsr"] Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.952094 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kcpdx"] Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.953473 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.956204 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 08:43:48 crc kubenswrapper[4765]: I1003 08:43:48.958325 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kcpdx"] Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.137696 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2shh4\" (UniqueName: \"kubernetes.io/projected/5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9-kube-api-access-2shh4\") pod \"community-operators-kcpdx\" (UID: \"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9\") " pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.138144 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9-utilities\") pod \"community-operators-kcpdx\" (UID: \"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9\") " pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.138249 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9-catalog-content\") pod \"community-operators-kcpdx\" (UID: \"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9\") " pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.222037 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vw7jl" event={"ID":"b1baebbf-ae2b-4d7c-a366-3c4ecb7741db","Type":"ContainerStarted","Data":"ad2aea2d6c243171bfa545d82270e8677dfbfcd096842b11e6549bdc3c5a4f36"} Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.224543 4765 generic.go:334] "Generic (PLEG): container finished" podID="085812e0-4616-4b7e-a4b4-1301aa194042" containerID="46f9d7b1ad3cc0e141537182550dd9ce0c085b1259cad51a7f47c30b18cdef0a" exitCode=0 Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.224614 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgnsr" event={"ID":"085812e0-4616-4b7e-a4b4-1301aa194042","Type":"ContainerDied","Data":"46f9d7b1ad3cc0e141537182550dd9ce0c085b1259cad51a7f47c30b18cdef0a"} Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.224678 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgnsr" event={"ID":"085812e0-4616-4b7e-a4b4-1301aa194042","Type":"ContainerStarted","Data":"dd22448543f4503b9e6d13118fff68046ca3aa6b241e310770cad0e0f6e88059"} Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.239520 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2shh4\" (UniqueName: \"kubernetes.io/projected/5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9-kube-api-access-2shh4\") pod \"community-operators-kcpdx\" (UID: \"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9\") " pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.239667 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9-utilities\") pod \"community-operators-kcpdx\" (UID: \"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9\") " pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.239693 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9-catalog-content\") pod \"community-operators-kcpdx\" (UID: \"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9\") " pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.240173 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9-catalog-content\") pod \"community-operators-kcpdx\" (UID: \"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9\") " pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.240484 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9-utilities\") pod \"community-operators-kcpdx\" (UID: \"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9\") " pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.250470 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vw7jl" podStartSLOduration=2.63439912 podStartE2EDuration="4.250449437s" podCreationTimestamp="2025-10-03 08:43:45 +0000 UTC" firstStartedPulling="2025-10-03 08:43:47.198422284 +0000 UTC m=+271.499916614" lastFinishedPulling="2025-10-03 08:43:48.814472601 +0000 UTC m=+273.115966931" observedRunningTime="2025-10-03 08:43:49.245109163 +0000 UTC m=+273.546603503" watchObservedRunningTime="2025-10-03 08:43:49.250449437 +0000 UTC m=+273.551943767" Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.264701 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2shh4\" (UniqueName: \"kubernetes.io/projected/5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9-kube-api-access-2shh4\") pod \"community-operators-kcpdx\" (UID: \"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9\") " pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.274057 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:43:49 crc kubenswrapper[4765]: I1003 08:43:49.484988 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kcpdx"] Oct 03 08:43:50 crc kubenswrapper[4765]: I1003 08:43:50.232199 4765 generic.go:334] "Generic (PLEG): container finished" podID="5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9" containerID="64453125a3fd6075ec5d0c3b98ab1be7a0c317833f9d076c860f74ca152b6822" exitCode=0 Oct 03 08:43:50 crc kubenswrapper[4765]: I1003 08:43:50.232290 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kcpdx" event={"ID":"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9","Type":"ContainerDied","Data":"64453125a3fd6075ec5d0c3b98ab1be7a0c317833f9d076c860f74ca152b6822"} Oct 03 08:43:50 crc kubenswrapper[4765]: I1003 08:43:50.232842 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kcpdx" event={"ID":"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9","Type":"ContainerStarted","Data":"b5853219238288ce1312e409d2d3e1cdf88559f3c59d338881aa504ef2a3fba1"} Oct 03 08:43:50 crc kubenswrapper[4765]: I1003 08:43:50.236137 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgnsr" event={"ID":"085812e0-4616-4b7e-a4b4-1301aa194042","Type":"ContainerStarted","Data":"94be3376868ab974ba3a80000c71cb22b22952ef53558993c255588c5f68d2fe"} Oct 03 08:43:50 crc kubenswrapper[4765]: I1003 08:43:50.239567 4765 generic.go:334] "Generic (PLEG): container finished" podID="66ed5f9c-a15e-45b4-b79f-e574371609c8" containerID="9cacb77e53ee7204931b5d3aba1227c875210a8761ae8e95004378f31364ec28" exitCode=0 Oct 03 08:43:50 crc kubenswrapper[4765]: I1003 08:43:50.240032 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zg6xn" event={"ID":"66ed5f9c-a15e-45b4-b79f-e574371609c8","Type":"ContainerDied","Data":"9cacb77e53ee7204931b5d3aba1227c875210a8761ae8e95004378f31364ec28"} Oct 03 08:43:51 crc kubenswrapper[4765]: I1003 08:43:51.246191 4765 generic.go:334] "Generic (PLEG): container finished" podID="085812e0-4616-4b7e-a4b4-1301aa194042" containerID="94be3376868ab974ba3a80000c71cb22b22952ef53558993c255588c5f68d2fe" exitCode=0 Oct 03 08:43:51 crc kubenswrapper[4765]: I1003 08:43:51.246272 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgnsr" event={"ID":"085812e0-4616-4b7e-a4b4-1301aa194042","Type":"ContainerDied","Data":"94be3376868ab974ba3a80000c71cb22b22952ef53558993c255588c5f68d2fe"} Oct 03 08:43:51 crc kubenswrapper[4765]: I1003 08:43:51.249493 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zg6xn" event={"ID":"66ed5f9c-a15e-45b4-b79f-e574371609c8","Type":"ContainerStarted","Data":"561665cee6de6e0ab6954842ab01e9c5bf10d1f0656d8c297cd6843781ed8906"} Oct 03 08:43:51 crc kubenswrapper[4765]: I1003 08:43:51.252928 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kcpdx" event={"ID":"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9","Type":"ContainerStarted","Data":"0304bd92a4c6f3f4d44911e2c5d66c96ad5888d235ba144122d1989ac4c5065c"} Oct 03 08:43:51 crc kubenswrapper[4765]: I1003 08:43:51.282290 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zg6xn" podStartSLOduration=2.756978219 podStartE2EDuration="5.282255272s" 
podCreationTimestamp="2025-10-03 08:43:46 +0000 UTC" firstStartedPulling="2025-10-03 08:43:48.207992208 +0000 UTC m=+272.509486538" lastFinishedPulling="2025-10-03 08:43:50.733269261 +0000 UTC m=+275.034763591" observedRunningTime="2025-10-03 08:43:51.281676397 +0000 UTC m=+275.583170727" watchObservedRunningTime="2025-10-03 08:43:51.282255272 +0000 UTC m=+275.583749602" Oct 03 08:43:52 crc kubenswrapper[4765]: I1003 08:43:52.271484 4765 generic.go:334] "Generic (PLEG): container finished" podID="5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9" containerID="0304bd92a4c6f3f4d44911e2c5d66c96ad5888d235ba144122d1989ac4c5065c" exitCode=0 Oct 03 08:43:52 crc kubenswrapper[4765]: I1003 08:43:52.271558 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kcpdx" event={"ID":"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9","Type":"ContainerDied","Data":"0304bd92a4c6f3f4d44911e2c5d66c96ad5888d235ba144122d1989ac4c5065c"} Oct 03 08:43:54 crc kubenswrapper[4765]: I1003 08:43:54.286415 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kcpdx" event={"ID":"5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9","Type":"ContainerStarted","Data":"f13883b6ff96102e2b877197ab9da3514bc45fbfd9c5b111e2374380fb167af2"} Oct 03 08:43:54 crc kubenswrapper[4765]: I1003 08:43:54.289924 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bgnsr" event={"ID":"085812e0-4616-4b7e-a4b4-1301aa194042","Type":"ContainerStarted","Data":"12da4bc41fce3961a93ceff7cefbaad331479beb469dc960f71bf236fec0915d"} Oct 03 08:43:54 crc kubenswrapper[4765]: I1003 08:43:54.305318 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kcpdx" podStartSLOduration=3.649222176 podStartE2EDuration="6.305302901s" podCreationTimestamp="2025-10-03 08:43:48 +0000 UTC" firstStartedPulling="2025-10-03 08:43:50.234187512 +0000 UTC m=+274.535681842" lastFinishedPulling="2025-10-03 08:43:52.890268237 +0000 UTC m=+277.191762567" observedRunningTime="2025-10-03 08:43:54.303865335 +0000 UTC m=+278.605359695" watchObservedRunningTime="2025-10-03 08:43:54.305302901 +0000 UTC m=+278.606797231" Oct 03 08:43:54 crc kubenswrapper[4765]: I1003 08:43:54.329229 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bgnsr" podStartSLOduration=3.813822845 podStartE2EDuration="6.32919135s" podCreationTimestamp="2025-10-03 08:43:48 +0000 UTC" firstStartedPulling="2025-10-03 08:43:49.225901311 +0000 UTC m=+273.527395641" lastFinishedPulling="2025-10-03 08:43:51.741269816 +0000 UTC m=+276.042764146" observedRunningTime="2025-10-03 08:43:54.326497963 +0000 UTC m=+278.627992303" watchObservedRunningTime="2025-10-03 08:43:54.32919135 +0000 UTC m=+278.630685690" Oct 03 08:43:56 crc kubenswrapper[4765]: I1003 08:43:56.254256 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:56 crc kubenswrapper[4765]: I1003 08:43:56.254665 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:56 crc kubenswrapper[4765]: I1003 08:43:56.302516 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:56 crc kubenswrapper[4765]: I1003 08:43:56.342019 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-vw7jl" Oct 03 08:43:56 crc kubenswrapper[4765]: I1003 08:43:56.866837 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:56 crc kubenswrapper[4765]: I1003 08:43:56.866894 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:56 crc kubenswrapper[4765]: I1003 08:43:56.908405 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:57 crc kubenswrapper[4765]: I1003 08:43:57.348530 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zg6xn" Oct 03 08:43:58 crc kubenswrapper[4765]: I1003 08:43:58.661792 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:43:58 crc kubenswrapper[4765]: I1003 08:43:58.661862 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:44:00 crc kubenswrapper[4765]: I1003 08:43:58.710868 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:44:00 crc kubenswrapper[4765]: I1003 08:43:59.275100 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:44:00 crc kubenswrapper[4765]: I1003 08:43:59.275522 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:44:00 crc kubenswrapper[4765]: I1003 08:43:59.322361 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:44:00 crc kubenswrapper[4765]: I1003 08:43:59.362771 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bgnsr" Oct 03 08:44:00 crc kubenswrapper[4765]: I1003 08:43:59.371365 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kcpdx" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.142332 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs"] Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.144331 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.146996 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.147897 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.156250 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs"] Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.323965 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2655d101-f2a0-494e-b78b-d6d37f771960-config-volume\") pod \"collect-profiles-29324685-wjbbs\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.324044 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndbtv\" (UniqueName: \"kubernetes.io/projected/2655d101-f2a0-494e-b78b-d6d37f771960-kube-api-access-ndbtv\") pod \"collect-profiles-29324685-wjbbs\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.324081 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2655d101-f2a0-494e-b78b-d6d37f771960-secret-volume\") pod \"collect-profiles-29324685-wjbbs\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.425571 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2655d101-f2a0-494e-b78b-d6d37f771960-secret-volume\") pod \"collect-profiles-29324685-wjbbs\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.425742 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2655d101-f2a0-494e-b78b-d6d37f771960-config-volume\") pod \"collect-profiles-29324685-wjbbs\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.425837 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndbtv\" (UniqueName: \"kubernetes.io/projected/2655d101-f2a0-494e-b78b-d6d37f771960-kube-api-access-ndbtv\") pod \"collect-profiles-29324685-wjbbs\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.428114 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2655d101-f2a0-494e-b78b-d6d37f771960-config-volume\") pod 
\"collect-profiles-29324685-wjbbs\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.435170 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2655d101-f2a0-494e-b78b-d6d37f771960-secret-volume\") pod \"collect-profiles-29324685-wjbbs\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.450356 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndbtv\" (UniqueName: \"kubernetes.io/projected/2655d101-f2a0-494e-b78b-d6d37f771960-kube-api-access-ndbtv\") pod \"collect-profiles-29324685-wjbbs\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.468262 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:00 crc kubenswrapper[4765]: I1003 08:45:00.660994 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs"] Oct 03 08:45:01 crc kubenswrapper[4765]: I1003 08:45:01.646614 4765 generic.go:334] "Generic (PLEG): container finished" podID="2655d101-f2a0-494e-b78b-d6d37f771960" containerID="09d79c2aa3536a9712a2e60e6ed271b5910884837d8034b8de022782712ff701" exitCode=0 Oct 03 08:45:01 crc kubenswrapper[4765]: I1003 08:45:01.646694 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" event={"ID":"2655d101-f2a0-494e-b78b-d6d37f771960","Type":"ContainerDied","Data":"09d79c2aa3536a9712a2e60e6ed271b5910884837d8034b8de022782712ff701"} Oct 03 08:45:01 crc kubenswrapper[4765]: I1003 08:45:01.646914 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" event={"ID":"2655d101-f2a0-494e-b78b-d6d37f771960","Type":"ContainerStarted","Data":"fd85ae30c1e72bb2d0befbd890ed784c3f52f3e0126cad68f0e69a15bea72062"} Oct 03 08:45:02 crc kubenswrapper[4765]: I1003 08:45:02.846306 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:02 crc kubenswrapper[4765]: I1003 08:45:02.857671 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndbtv\" (UniqueName: \"kubernetes.io/projected/2655d101-f2a0-494e-b78b-d6d37f771960-kube-api-access-ndbtv\") pod \"2655d101-f2a0-494e-b78b-d6d37f771960\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " Oct 03 08:45:02 crc kubenswrapper[4765]: I1003 08:45:02.857751 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2655d101-f2a0-494e-b78b-d6d37f771960-secret-volume\") pod \"2655d101-f2a0-494e-b78b-d6d37f771960\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " Oct 03 08:45:02 crc kubenswrapper[4765]: I1003 08:45:02.857785 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2655d101-f2a0-494e-b78b-d6d37f771960-config-volume\") pod \"2655d101-f2a0-494e-b78b-d6d37f771960\" (UID: \"2655d101-f2a0-494e-b78b-d6d37f771960\") " Oct 03 08:45:02 crc kubenswrapper[4765]: I1003 08:45:02.859137 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2655d101-f2a0-494e-b78b-d6d37f771960-config-volume" (OuterVolumeSpecName: "config-volume") pod "2655d101-f2a0-494e-b78b-d6d37f771960" (UID: "2655d101-f2a0-494e-b78b-d6d37f771960"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:45:02 crc kubenswrapper[4765]: I1003 08:45:02.865224 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2655d101-f2a0-494e-b78b-d6d37f771960-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2655d101-f2a0-494e-b78b-d6d37f771960" (UID: "2655d101-f2a0-494e-b78b-d6d37f771960"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:02 crc kubenswrapper[4765]: I1003 08:45:02.865317 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2655d101-f2a0-494e-b78b-d6d37f771960-kube-api-access-ndbtv" (OuterVolumeSpecName: "kube-api-access-ndbtv") pod "2655d101-f2a0-494e-b78b-d6d37f771960" (UID: "2655d101-f2a0-494e-b78b-d6d37f771960"). InnerVolumeSpecName "kube-api-access-ndbtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:45:02 crc kubenswrapper[4765]: I1003 08:45:02.959845 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndbtv\" (UniqueName: \"kubernetes.io/projected/2655d101-f2a0-494e-b78b-d6d37f771960-kube-api-access-ndbtv\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:02 crc kubenswrapper[4765]: I1003 08:45:02.959888 4765 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2655d101-f2a0-494e-b78b-d6d37f771960-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:02 crc kubenswrapper[4765]: I1003 08:45:02.959900 4765 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2655d101-f2a0-494e-b78b-d6d37f771960-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:03 crc kubenswrapper[4765]: I1003 08:45:03.658000 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" event={"ID":"2655d101-f2a0-494e-b78b-d6d37f771960","Type":"ContainerDied","Data":"fd85ae30c1e72bb2d0befbd890ed784c3f52f3e0126cad68f0e69a15bea72062"} Oct 03 08:45:03 crc kubenswrapper[4765]: I1003 08:45:03.658320 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd85ae30c1e72bb2d0befbd890ed784c3f52f3e0126cad68f0e69a15bea72062" Oct 03 08:45:03 crc kubenswrapper[4765]: I1003 08:45:03.658041 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-wjbbs" Oct 03 08:45:30 crc kubenswrapper[4765]: I1003 08:45:30.680912 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:45:30 crc kubenswrapper[4765]: I1003 08:45:30.681466 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:46:00 crc kubenswrapper[4765]: I1003 08:46:00.680504 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:46:00 crc kubenswrapper[4765]: I1003 08:46:00.681084 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:46:30 crc kubenswrapper[4765]: I1003 08:46:30.681216 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:46:30 crc kubenswrapper[4765]: I1003 
08:46:30.682222 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:46:30 crc kubenswrapper[4765]: I1003 08:46:30.682301 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:46:30 crc kubenswrapper[4765]: I1003 08:46:30.684686 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f8f01250a245724a5319a16414554a23f7e9e678c70099524c09e960696e7846"} pod="openshift-machine-config-operator/machine-config-daemon-j8mss" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:46:30 crc kubenswrapper[4765]: I1003 08:46:30.684798 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" containerID="cri-o://f8f01250a245724a5319a16414554a23f7e9e678c70099524c09e960696e7846" gracePeriod=600 Oct 03 08:46:31 crc kubenswrapper[4765]: I1003 08:46:31.137693 4765 generic.go:334] "Generic (PLEG): container finished" podID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerID="f8f01250a245724a5319a16414554a23f7e9e678c70099524c09e960696e7846" exitCode=0 Oct 03 08:46:31 crc kubenswrapper[4765]: I1003 08:46:31.137905 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerDied","Data":"f8f01250a245724a5319a16414554a23f7e9e678c70099524c09e960696e7846"} Oct 03 08:46:31 crc kubenswrapper[4765]: I1003 08:46:31.138299 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"089a5d4e72b0c207517b7963f9dfafd9affe2b20c69fc8f9bbf6c0c97d24c65d"} Oct 03 08:46:31 crc kubenswrapper[4765]: I1003 08:46:31.138377 4765 scope.go:117] "RemoveContainer" containerID="714c78e9165f96e2aee03ad7be980399f06aeb852da4d76611c236f262518281" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.798178 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8svlv"] Oct 03 08:46:58 crc kubenswrapper[4765]: E1003 08:46:58.800179 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2655d101-f2a0-494e-b78b-d6d37f771960" containerName="collect-profiles" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.800263 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2655d101-f2a0-494e-b78b-d6d37f771960" containerName="collect-profiles" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.800437 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2655d101-f2a0-494e-b78b-d6d37f771960" containerName="collect-profiles" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.800985 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.823656 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8svlv"] Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.906686 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-bound-sa-token\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.906756 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.906785 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-registry-tls\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.906808 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.906843 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.906864 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-trusted-ca\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.906883 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-registry-certificates\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.906900 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxlqs\" (UniqueName: 
\"kubernetes.io/projected/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-kube-api-access-dxlqs\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:58 crc kubenswrapper[4765]: I1003 08:46:58.931460 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.008022 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.008452 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-trusted-ca\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.008483 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-registry-certificates\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.008507 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxlqs\" (UniqueName: \"kubernetes.io/projected/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-kube-api-access-dxlqs\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.008561 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-bound-sa-token\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.008620 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.008639 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.008710 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-registry-tls\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.009900 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-trusted-ca\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.010198 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-registry-certificates\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.022324 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.022493 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-registry-tls\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.027354 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxlqs\" (UniqueName: \"kubernetes.io/projected/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-kube-api-access-dxlqs\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.027958 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c9f0c287-c7e4-49e6-aed2-9746cae4bb83-bound-sa-token\") pod \"image-registry-66df7c8f76-8svlv\" (UID: \"c9f0c287-c7e4-49e6-aed2-9746cae4bb83\") " pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.119330 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:46:59 crc kubenswrapper[4765]: I1003 08:46:59.312095 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8svlv"] Oct 03 08:47:00 crc kubenswrapper[4765]: I1003 08:47:00.299898 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" event={"ID":"c9f0c287-c7e4-49e6-aed2-9746cae4bb83","Type":"ContainerStarted","Data":"9f09c47fddad8a607350353ebe50911878c2626f81ae802ff1ff7b3cc45a1cce"} Oct 03 08:47:00 crc kubenswrapper[4765]: I1003 08:47:00.299950 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" event={"ID":"c9f0c287-c7e4-49e6-aed2-9746cae4bb83","Type":"ContainerStarted","Data":"eaad9709038373ba26fc127b02d1c6f02b303cc362d5a869434f35df32c45afb"} Oct 03 08:47:00 crc kubenswrapper[4765]: I1003 08:47:00.300060 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:47:00 crc kubenswrapper[4765]: I1003 08:47:00.327803 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" podStartSLOduration=2.327785323 podStartE2EDuration="2.327785323s" podCreationTimestamp="2025-10-03 08:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:47:00.324747365 +0000 UTC m=+464.626241695" watchObservedRunningTime="2025-10-03 08:47:00.327785323 +0000 UTC m=+464.629279653" Oct 03 08:47:19 crc kubenswrapper[4765]: I1003 08:47:19.125736 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8svlv" Oct 03 08:47:19 crc kubenswrapper[4765]: I1003 08:47:19.178018 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qwb6x"] Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.215724 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" podUID="32b16068-abfd-4a3f-870c-a17c7ff31d4b" containerName="registry" containerID="cri-o://e92cfe8eb3ebb1e407b2a79114c38b5b0ef27cbdbbc86d9736f8790ee2d1e875" gracePeriod=30 Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.518162 4765 generic.go:334] "Generic (PLEG): container finished" podID="32b16068-abfd-4a3f-870c-a17c7ff31d4b" containerID="e92cfe8eb3ebb1e407b2a79114c38b5b0ef27cbdbbc86d9736f8790ee2d1e875" exitCode=0 Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.518244 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" event={"ID":"32b16068-abfd-4a3f-870c-a17c7ff31d4b","Type":"ContainerDied","Data":"e92cfe8eb3ebb1e407b2a79114c38b5b0ef27cbdbbc86d9736f8790ee2d1e875"} Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.518815 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" event={"ID":"32b16068-abfd-4a3f-870c-a17c7ff31d4b","Type":"ContainerDied","Data":"16f5447bc263041e5e6ab54c072d1a8842f6702c45704c96e83efcd4ee3ba199"} Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.518874 4765 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="16f5447bc263041e5e6ab54c072d1a8842f6702c45704c96e83efcd4ee3ba199" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.535129 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.697321 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32b16068-abfd-4a3f-870c-a17c7ff31d4b-ca-trust-extracted\") pod \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.697374 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-trusted-ca\") pod \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.697403 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-bound-sa-token\") pod \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.697442 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-certificates\") pod \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.697620 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.697676 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32b16068-abfd-4a3f-870c-a17c7ff31d4b-installation-pull-secrets\") pod \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.697705 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8dbx\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-kube-api-access-w8dbx\") pod \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.697727 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-tls\") pod \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\" (UID: \"32b16068-abfd-4a3f-870c-a17c7ff31d4b\") " Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.701431 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "32b16068-abfd-4a3f-870c-a17c7ff31d4b" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.702389 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "32b16068-abfd-4a3f-870c-a17c7ff31d4b" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.703709 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "32b16068-abfd-4a3f-870c-a17c7ff31d4b" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.704576 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b16068-abfd-4a3f-870c-a17c7ff31d4b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "32b16068-abfd-4a3f-870c-a17c7ff31d4b" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.704823 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "32b16068-abfd-4a3f-870c-a17c7ff31d4b" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.706409 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-kube-api-access-w8dbx" (OuterVolumeSpecName: "kube-api-access-w8dbx") pod "32b16068-abfd-4a3f-870c-a17c7ff31d4b" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b"). InnerVolumeSpecName "kube-api-access-w8dbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.710523 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "32b16068-abfd-4a3f-870c-a17c7ff31d4b" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.718273 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b16068-abfd-4a3f-870c-a17c7ff31d4b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "32b16068-abfd-4a3f-870c-a17c7ff31d4b" (UID: "32b16068-abfd-4a3f-870c-a17c7ff31d4b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.798844 4765 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/32b16068-abfd-4a3f-870c-a17c7ff31d4b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.798874 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8dbx\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-kube-api-access-w8dbx\") on node \"crc\" DevicePath \"\"" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.798886 4765 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.798898 4765 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/32b16068-abfd-4a3f-870c-a17c7ff31d4b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.798909 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.798917 4765 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/32b16068-abfd-4a3f-870c-a17c7ff31d4b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 08:47:44 crc kubenswrapper[4765]: I1003 08:47:44.798929 4765 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/32b16068-abfd-4a3f-870c-a17c7ff31d4b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 08:47:45 crc kubenswrapper[4765]: I1003 08:47:45.523770 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-qwb6x" Oct 03 08:47:45 crc kubenswrapper[4765]: I1003 08:47:45.558717 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qwb6x"] Oct 03 08:47:45 crc kubenswrapper[4765]: I1003 08:47:45.561296 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-qwb6x"] Oct 03 08:47:46 crc kubenswrapper[4765]: I1003 08:47:46.314068 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b16068-abfd-4a3f-870c-a17c7ff31d4b" path="/var/lib/kubelet/pods/32b16068-abfd-4a3f-870c-a17c7ff31d4b/volumes" Oct 03 08:48:16 crc kubenswrapper[4765]: I1003 08:48:16.430801 4765 scope.go:117] "RemoveContainer" containerID="e92cfe8eb3ebb1e407b2a79114c38b5b0ef27cbdbbc86d9736f8790ee2d1e875" Oct 03 08:48:30 crc kubenswrapper[4765]: I1003 08:48:30.680099 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:48:30 crc kubenswrapper[4765]: I1003 08:48:30.681145 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:49:00 crc kubenswrapper[4765]: I1003 08:49:00.680508 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:49:00 crc kubenswrapper[4765]: I1003 08:49:00.681076 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:49:30 crc kubenswrapper[4765]: I1003 08:49:30.680066 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:49:30 crc kubenswrapper[4765]: I1003 08:49:30.681109 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:49:30 crc kubenswrapper[4765]: I1003 08:49:30.681183 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:49:30 crc kubenswrapper[4765]: I1003 08:49:30.682187 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"089a5d4e72b0c207517b7963f9dfafd9affe2b20c69fc8f9bbf6c0c97d24c65d"} pod="openshift-machine-config-operator/machine-config-daemon-j8mss" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:49:30 crc kubenswrapper[4765]: I1003 08:49:30.682272 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" containerID="cri-o://089a5d4e72b0c207517b7963f9dfafd9affe2b20c69fc8f9bbf6c0c97d24c65d" gracePeriod=600 Oct 03 08:49:31 crc kubenswrapper[4765]: I1003 08:49:31.030418 4765 generic.go:334] "Generic (PLEG): container finished" podID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerID="089a5d4e72b0c207517b7963f9dfafd9affe2b20c69fc8f9bbf6c0c97d24c65d" exitCode=0 Oct 03 08:49:31 crc kubenswrapper[4765]: I1003 08:49:31.030467 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerDied","Data":"089a5d4e72b0c207517b7963f9dfafd9affe2b20c69fc8f9bbf6c0c97d24c65d"} Oct 03 08:49:31 crc kubenswrapper[4765]: I1003 08:49:31.030517 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"5477c0c212e204859773f5f32ccf8d9a259a11b347c7a6534e70a233d47641c8"} Oct 03 08:49:31 crc kubenswrapper[4765]: I1003 08:49:31.030535 4765 scope.go:117] "RemoveContainer" containerID="f8f01250a245724a5319a16414554a23f7e9e678c70099524c09e960696e7846" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.133732 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl"] Oct 03 08:50:06 crc kubenswrapper[4765]: E1003 08:50:06.134492 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b16068-abfd-4a3f-870c-a17c7ff31d4b" containerName="registry" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.134506 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b16068-abfd-4a3f-870c-a17c7ff31d4b" containerName="registry" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.134599 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b16068-abfd-4a3f-870c-a17c7ff31d4b" containerName="registry" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.135334 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.141131 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.148992 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl"] Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.292994 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p5gd\" (UniqueName: \"kubernetes.io/projected/4522df7f-bbdd-4884-a115-d33fec3bb365-kube-api-access-2p5gd\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.293136 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.293161 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.395878 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.395955 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.396041 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p5gd\" (UniqueName: \"kubernetes.io/projected/4522df7f-bbdd-4884-a115-d33fec3bb365-kube-api-access-2p5gd\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.396398 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.396436 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.417857 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p5gd\" (UniqueName: \"kubernetes.io/projected/4522df7f-bbdd-4884-a115-d33fec3bb365-kube-api-access-2p5gd\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.457667 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:06 crc kubenswrapper[4765]: I1003 08:50:06.855012 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl"] Oct 03 08:50:06 crc kubenswrapper[4765]: W1003 08:50:06.863360 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4522df7f_bbdd_4884_a115_d33fec3bb365.slice/crio-5964e02a917820c958317b15e84777642e63a605ef479e7ab6c2efde4053d314 WatchSource:0}: Error finding container 5964e02a917820c958317b15e84777642e63a605ef479e7ab6c2efde4053d314: Status 404 returned error can't find the container with id 5964e02a917820c958317b15e84777642e63a605ef479e7ab6c2efde4053d314 Oct 03 08:50:07 crc kubenswrapper[4765]: I1003 08:50:07.217137 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" event={"ID":"4522df7f-bbdd-4884-a115-d33fec3bb365","Type":"ContainerStarted","Data":"b57cbbc3fa5845fbac69ddac38662ae8f77872c4bc9d3f093b56a887302c4c5e"} Oct 03 08:50:07 crc kubenswrapper[4765]: I1003 08:50:07.217521 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" event={"ID":"4522df7f-bbdd-4884-a115-d33fec3bb365","Type":"ContainerStarted","Data":"5964e02a917820c958317b15e84777642e63a605ef479e7ab6c2efde4053d314"} Oct 03 08:50:08 crc kubenswrapper[4765]: I1003 08:50:08.232250 4765 generic.go:334] "Generic (PLEG): container finished" podID="4522df7f-bbdd-4884-a115-d33fec3bb365" containerID="b57cbbc3fa5845fbac69ddac38662ae8f77872c4bc9d3f093b56a887302c4c5e" exitCode=0 Oct 03 08:50:08 crc kubenswrapper[4765]: I1003 08:50:08.232309 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" event={"ID":"4522df7f-bbdd-4884-a115-d33fec3bb365","Type":"ContainerDied","Data":"b57cbbc3fa5845fbac69ddac38662ae8f77872c4bc9d3f093b56a887302c4c5e"} Oct 03 08:50:08 crc 
kubenswrapper[4765]: I1003 08:50:08.234758 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:50:10 crc kubenswrapper[4765]: I1003 08:50:10.262180 4765 generic.go:334] "Generic (PLEG): container finished" podID="4522df7f-bbdd-4884-a115-d33fec3bb365" containerID="5d2e18e5ca6f110d62ea93821e836b975ba5e16c2c85016fe6bbce7c2ae35790" exitCode=0 Oct 03 08:50:10 crc kubenswrapper[4765]: I1003 08:50:10.262340 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" event={"ID":"4522df7f-bbdd-4884-a115-d33fec3bb365","Type":"ContainerDied","Data":"5d2e18e5ca6f110d62ea93821e836b975ba5e16c2c85016fe6bbce7c2ae35790"} Oct 03 08:50:11 crc kubenswrapper[4765]: I1003 08:50:11.269209 4765 generic.go:334] "Generic (PLEG): container finished" podID="4522df7f-bbdd-4884-a115-d33fec3bb365" containerID="400483bfe3b0db34a4b194c802d0a1fca4c385edf0f1392462e22955f75f1572" exitCode=0 Oct 03 08:50:11 crc kubenswrapper[4765]: I1003 08:50:11.269343 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" event={"ID":"4522df7f-bbdd-4884-a115-d33fec3bb365","Type":"ContainerDied","Data":"400483bfe3b0db34a4b194c802d0a1fca4c385edf0f1392462e22955f75f1572"} Oct 03 08:50:12 crc kubenswrapper[4765]: I1003 08:50:12.498985 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:12 crc kubenswrapper[4765]: I1003 08:50:12.575837 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p5gd\" (UniqueName: \"kubernetes.io/projected/4522df7f-bbdd-4884-a115-d33fec3bb365-kube-api-access-2p5gd\") pod \"4522df7f-bbdd-4884-a115-d33fec3bb365\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " Oct 03 08:50:12 crc kubenswrapper[4765]: I1003 08:50:12.576438 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-util\") pod \"4522df7f-bbdd-4884-a115-d33fec3bb365\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " Oct 03 08:50:12 crc kubenswrapper[4765]: I1003 08:50:12.576533 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-bundle\") pod \"4522df7f-bbdd-4884-a115-d33fec3bb365\" (UID: \"4522df7f-bbdd-4884-a115-d33fec3bb365\") " Oct 03 08:50:12 crc kubenswrapper[4765]: I1003 08:50:12.578485 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-bundle" (OuterVolumeSpecName: "bundle") pod "4522df7f-bbdd-4884-a115-d33fec3bb365" (UID: "4522df7f-bbdd-4884-a115-d33fec3bb365"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:50:12 crc kubenswrapper[4765]: I1003 08:50:12.582122 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4522df7f-bbdd-4884-a115-d33fec3bb365-kube-api-access-2p5gd" (OuterVolumeSpecName: "kube-api-access-2p5gd") pod "4522df7f-bbdd-4884-a115-d33fec3bb365" (UID: "4522df7f-bbdd-4884-a115-d33fec3bb365"). InnerVolumeSpecName "kube-api-access-2p5gd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:50:12 crc kubenswrapper[4765]: I1003 08:50:12.677596 4765 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:12 crc kubenswrapper[4765]: I1003 08:50:12.677634 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p5gd\" (UniqueName: \"kubernetes.io/projected/4522df7f-bbdd-4884-a115-d33fec3bb365-kube-api-access-2p5gd\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:12 crc kubenswrapper[4765]: I1003 08:50:12.739730 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-util" (OuterVolumeSpecName: "util") pod "4522df7f-bbdd-4884-a115-d33fec3bb365" (UID: "4522df7f-bbdd-4884-a115-d33fec3bb365"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:50:12 crc kubenswrapper[4765]: I1003 08:50:12.778418 4765 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4522df7f-bbdd-4884-a115-d33fec3bb365-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:13 crc kubenswrapper[4765]: I1003 08:50:13.283078 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" event={"ID":"4522df7f-bbdd-4884-a115-d33fec3bb365","Type":"ContainerDied","Data":"5964e02a917820c958317b15e84777642e63a605ef479e7ab6c2efde4053d314"} Oct 03 08:50:13 crc kubenswrapper[4765]: I1003 08:50:13.283122 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5964e02a917820c958317b15e84777642e63a605ef479e7ab6c2efde4053d314" Oct 03 08:50:13 crc kubenswrapper[4765]: I1003 08:50:13.283377 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.080696 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-b27n7"] Oct 03 08:50:23 crc kubenswrapper[4765]: E1003 08:50:23.081524 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4522df7f-bbdd-4884-a115-d33fec3bb365" containerName="extract" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.081543 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4522df7f-bbdd-4884-a115-d33fec3bb365" containerName="extract" Oct 03 08:50:23 crc kubenswrapper[4765]: E1003 08:50:23.081560 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4522df7f-bbdd-4884-a115-d33fec3bb365" containerName="util" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.081568 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4522df7f-bbdd-4884-a115-d33fec3bb365" containerName="util" Oct 03 08:50:23 crc kubenswrapper[4765]: E1003 08:50:23.081582 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4522df7f-bbdd-4884-a115-d33fec3bb365" containerName="pull" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.081589 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4522df7f-bbdd-4884-a115-d33fec3bb365" containerName="pull" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.081723 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="4522df7f-bbdd-4884-a115-d33fec3bb365" containerName="extract" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.082221 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-b27n7" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.084543 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.085059 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-4b4f2" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.091118 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-b27n7"] Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.093559 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.127004 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2"] Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.127731 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.129573 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.129778 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-9qqt2" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.149421 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8"] Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.150321 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.153194 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2"] Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.184801 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8"] Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.225118 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4310e68b-d273-4784-8e5f-9114306616d8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-665f459cd-2mtm8\" (UID: \"4310e68b-d273-4784-8e5f-9114306616d8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.225171 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4310e68b-d273-4784-8e5f-9114306616d8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-665f459cd-2mtm8\" (UID: \"4310e68b-d273-4784-8e5f-9114306616d8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.225208 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3f879d8d-cb72-4b5f-a9f6-96ba70a723c0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-665f459cd-hcgt2\" (UID: \"3f879d8d-cb72-4b5f-a9f6-96ba70a723c0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.225234 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnjm2\" (UniqueName: \"kubernetes.io/projected/bc3e65d2-829a-4385-b865-15e288293af9-kube-api-access-lnjm2\") pod \"obo-prometheus-operator-7c8cf85677-b27n7\" (UID: \"bc3e65d2-829a-4385-b865-15e288293af9\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-b27n7" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.225256 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f879d8d-cb72-4b5f-a9f6-96ba70a723c0-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-665f459cd-hcgt2\" (UID: \"3f879d8d-cb72-4b5f-a9f6-96ba70a723c0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.327081 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4310e68b-d273-4784-8e5f-9114306616d8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-665f459cd-2mtm8\" (UID: \"4310e68b-d273-4784-8e5f-9114306616d8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.327188 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4310e68b-d273-4784-8e5f-9114306616d8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-665f459cd-2mtm8\" (UID: \"4310e68b-d273-4784-8e5f-9114306616d8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.327259 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3f879d8d-cb72-4b5f-a9f6-96ba70a723c0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-665f459cd-hcgt2\" (UID: \"3f879d8d-cb72-4b5f-a9f6-96ba70a723c0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.327307 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnjm2\" (UniqueName: \"kubernetes.io/projected/bc3e65d2-829a-4385-b865-15e288293af9-kube-api-access-lnjm2\") pod \"obo-prometheus-operator-7c8cf85677-b27n7\" (UID: \"bc3e65d2-829a-4385-b865-15e288293af9\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-b27n7" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.327343 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f879d8d-cb72-4b5f-a9f6-96ba70a723c0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-665f459cd-hcgt2\" (UID: \"3f879d8d-cb72-4b5f-a9f6-96ba70a723c0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.338763 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f879d8d-cb72-4b5f-a9f6-96ba70a723c0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-665f459cd-hcgt2\" (UID: \"3f879d8d-cb72-4b5f-a9f6-96ba70a723c0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.345317 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4310e68b-d273-4784-8e5f-9114306616d8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-665f459cd-2mtm8\" (UID: \"4310e68b-d273-4784-8e5f-9114306616d8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.346427 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/3f879d8d-cb72-4b5f-a9f6-96ba70a723c0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-665f459cd-hcgt2\" (UID: \"3f879d8d-cb72-4b5f-a9f6-96ba70a723c0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.347219 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4310e68b-d273-4784-8e5f-9114306616d8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-665f459cd-2mtm8\" (UID: \"4310e68b-d273-4784-8e5f-9114306616d8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.353785 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-ndvd9"] Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.365550 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.374202 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-7pln9" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.374597 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.406888 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-ndvd9"] Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.407697 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnjm2\" (UniqueName: \"kubernetes.io/projected/bc3e65d2-829a-4385-b865-15e288293af9-kube-api-access-lnjm2\") pod \"obo-prometheus-operator-7c8cf85677-b27n7\" (UID: \"bc3e65d2-829a-4385-b865-15e288293af9\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-b27n7" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.443256 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.463472 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.536101 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scwmf\" (UniqueName: \"kubernetes.io/projected/9b3ed21b-9c8f-45e2-a048-c6d1e0324360-kube-api-access-scwmf\") pod \"observability-operator-cc5f78dfc-ndvd9\" (UID: \"9b3ed21b-9c8f-45e2-a048-c6d1e0324360\") " pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.536172 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b3ed21b-9c8f-45e2-a048-c6d1e0324360-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-ndvd9\" (UID: \"9b3ed21b-9c8f-45e2-a048-c6d1e0324360\") " pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.544978 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-76rtm"] Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.546074 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.552861 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-xtj7d" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.613162 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-76rtm"] Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.638402 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-76rtm\" (UID: \"4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56\") " pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.638466 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scwmf\" (UniqueName: \"kubernetes.io/projected/9b3ed21b-9c8f-45e2-a048-c6d1e0324360-kube-api-access-scwmf\") pod \"observability-operator-cc5f78dfc-ndvd9\" (UID: \"9b3ed21b-9c8f-45e2-a048-c6d1e0324360\") " pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.638501 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z685c\" (UniqueName: \"kubernetes.io/projected/4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56-kube-api-access-z685c\") pod \"perses-operator-54bc95c9fb-76rtm\" (UID: \"4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56\") " pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.638529 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b3ed21b-9c8f-45e2-a048-c6d1e0324360-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-ndvd9\" (UID: \"9b3ed21b-9c8f-45e2-a048-c6d1e0324360\") " pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 
08:50:23.643460 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b3ed21b-9c8f-45e2-a048-c6d1e0324360-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-ndvd9\" (UID: \"9b3ed21b-9c8f-45e2-a048-c6d1e0324360\") " pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.659682 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scwmf\" (UniqueName: \"kubernetes.io/projected/9b3ed21b-9c8f-45e2-a048-c6d1e0324360-kube-api-access-scwmf\") pod \"observability-operator-cc5f78dfc-ndvd9\" (UID: \"9b3ed21b-9c8f-45e2-a048-c6d1e0324360\") " pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.697125 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-b27n7" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.736226 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.739857 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-76rtm\" (UID: \"4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56\") " pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.739919 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z685c\" (UniqueName: \"kubernetes.io/projected/4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56-kube-api-access-z685c\") pod \"perses-operator-54bc95c9fb-76rtm\" (UID: \"4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56\") " pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.741516 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-76rtm\" (UID: \"4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56\") " pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.747617 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2"] Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.764638 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z685c\" (UniqueName: \"kubernetes.io/projected/4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56-kube-api-access-z685c\") pod \"perses-operator-54bc95c9fb-76rtm\" (UID: \"4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56\") " pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" Oct 03 08:50:23 crc kubenswrapper[4765]: W1003 08:50:23.779964 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f879d8d_cb72_4b5f_a9f6_96ba70a723c0.slice/crio-15cbdcbad9a15f52b65bf07b46a46af5a5bd8d0bedfcadb3b19e0786babae2ae WatchSource:0}: Error finding container 15cbdcbad9a15f52b65bf07b46a46af5a5bd8d0bedfcadb3b19e0786babae2ae: Status 404 returned error can't find the container 
with id 15cbdcbad9a15f52b65bf07b46a46af5a5bd8d0bedfcadb3b19e0786babae2ae Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.808854 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8"] Oct 03 08:50:23 crc kubenswrapper[4765]: I1003 08:50:23.904711 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" Oct 03 08:50:24 crc kubenswrapper[4765]: I1003 08:50:24.052566 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-b27n7"] Oct 03 08:50:24 crc kubenswrapper[4765]: W1003 08:50:24.070219 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3e65d2_829a_4385_b865_15e288293af9.slice/crio-fc3d2e07536b47b7b86121e73880bed6919a8171331b7c431830d2d365e59acf WatchSource:0}: Error finding container fc3d2e07536b47b7b86121e73880bed6919a8171331b7c431830d2d365e59acf: Status 404 returned error can't find the container with id fc3d2e07536b47b7b86121e73880bed6919a8171331b7c431830d2d365e59acf Oct 03 08:50:24 crc kubenswrapper[4765]: I1003 08:50:24.101422 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-ndvd9"] Oct 03 08:50:24 crc kubenswrapper[4765]: W1003 08:50:24.111118 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b3ed21b_9c8f_45e2_a048_c6d1e0324360.slice/crio-a46dcffb5efdd7a98132941c0757625e38cc012b763a481d7bb26e986dbfc858 WatchSource:0}: Error finding container a46dcffb5efdd7a98132941c0757625e38cc012b763a481d7bb26e986dbfc858: Status 404 returned error can't find the container with id a46dcffb5efdd7a98132941c0757625e38cc012b763a481d7bb26e986dbfc858 Oct 03 08:50:24 crc kubenswrapper[4765]: I1003 08:50:24.180358 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-76rtm"] Oct 03 08:50:24 crc kubenswrapper[4765]: W1003 08:50:24.189458 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cdf3ecf_31a5_43b7_9df2_d0a6d2e8fb56.slice/crio-8f506c6ea73b10d508009eabf9332b778da59761049e770c75679cad57fb32ba WatchSource:0}: Error finding container 8f506c6ea73b10d508009eabf9332b778da59761049e770c75679cad57fb32ba: Status 404 returned error can't find the container with id 8f506c6ea73b10d508009eabf9332b778da59761049e770c75679cad57fb32ba Oct 03 08:50:24 crc kubenswrapper[4765]: I1003 08:50:24.347383 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2" event={"ID":"3f879d8d-cb72-4b5f-a9f6-96ba70a723c0","Type":"ContainerStarted","Data":"15cbdcbad9a15f52b65bf07b46a46af5a5bd8d0bedfcadb3b19e0786babae2ae"} Oct 03 08:50:24 crc kubenswrapper[4765]: I1003 08:50:24.349585 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8" event={"ID":"4310e68b-d273-4784-8e5f-9114306616d8","Type":"ContainerStarted","Data":"ddd419f44ddebe21e1e1bee8919096efd6ad80226ceca26e6c0ffb7bb26115b7"} Oct 03 08:50:24 crc kubenswrapper[4765]: I1003 08:50:24.351555 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" 
event={"ID":"4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56","Type":"ContainerStarted","Data":"8f506c6ea73b10d508009eabf9332b778da59761049e770c75679cad57fb32ba"} Oct 03 08:50:24 crc kubenswrapper[4765]: I1003 08:50:24.353441 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" event={"ID":"9b3ed21b-9c8f-45e2-a048-c6d1e0324360","Type":"ContainerStarted","Data":"a46dcffb5efdd7a98132941c0757625e38cc012b763a481d7bb26e986dbfc858"} Oct 03 08:50:24 crc kubenswrapper[4765]: I1003 08:50:24.355460 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-b27n7" event={"ID":"bc3e65d2-829a-4385-b865-15e288293af9","Type":"ContainerStarted","Data":"fc3d2e07536b47b7b86121e73880bed6919a8171331b7c431830d2d365e59acf"} Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.121821 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-srgbb"] Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.122540 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovn-controller" containerID="cri-o://68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca" gracePeriod=30 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.122915 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="sbdb" containerID="cri-o://6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf" gracePeriod=30 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.122956 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="nbdb" containerID="cri-o://902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458" gracePeriod=30 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.122983 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="northd" containerID="cri-o://95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c" gracePeriod=30 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.123012 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da" gracePeriod=30 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.123038 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="kube-rbac-proxy-node" containerID="cri-o://d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b" gracePeriod=30 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.123070 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovn-acl-logging" containerID="cri-o://a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a" gracePeriod=30 
Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.193179 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" containerID="cri-o://7f00afa4ccebb6c76784137043797c0ee3ab98e16e9dffb9acb0f972b0c35b63" gracePeriod=30 Oct 03 08:50:37 crc kubenswrapper[4765]: E1003 08:50:37.307930 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 03 08:50:37 crc kubenswrapper[4765]: E1003 08:50:37.308116 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 03 08:50:37 crc kubenswrapper[4765]: E1003 08:50:37.310081 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 03 08:50:37 crc kubenswrapper[4765]: E1003 08:50:37.310217 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 03 08:50:37 crc kubenswrapper[4765]: E1003 08:50:37.319196 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 03 08:50:37 crc kubenswrapper[4765]: E1003 08:50:37.319281 4765 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="nbdb" Oct 03 08:50:37 crc kubenswrapper[4765]: E1003 08:50:37.320186 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Oct 03 08:50:37 crc kubenswrapper[4765]: E1003 08:50:37.320223 4765 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="sbdb" Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.498529 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-csb5z_912755c8-dd28-4fbc-82de-9cf85df54f4f/kube-multus/2.log" Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.499152 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-csb5z_912755c8-dd28-4fbc-82de-9cf85df54f4f/kube-multus/1.log" Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.499205 4765 generic.go:334] "Generic (PLEG): container finished" podID="912755c8-dd28-4fbc-82de-9cf85df54f4f" containerID="8d3bc39e0926f0219495285f71e5ec98da034b168a3092f1121da2eabf6b6f10" exitCode=2 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.499302 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-csb5z" event={"ID":"912755c8-dd28-4fbc-82de-9cf85df54f4f","Type":"ContainerDied","Data":"8d3bc39e0926f0219495285f71e5ec98da034b168a3092f1121da2eabf6b6f10"} Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.499390 4765 scope.go:117] "RemoveContainer" containerID="52f5a7f443bf8e52988e8645ff60745a747d602261e7dbf01b68c58aaf9bae05" Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.500152 4765 scope.go:117] "RemoveContainer" containerID="8d3bc39e0926f0219495285f71e5ec98da034b168a3092f1121da2eabf6b6f10" Oct 03 08:50:37 crc kubenswrapper[4765]: E1003 08:50:37.500409 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-csb5z_openshift-multus(912755c8-dd28-4fbc-82de-9cf85df54f4f)\"" pod="openshift-multus/multus-csb5z" podUID="912755c8-dd28-4fbc-82de-9cf85df54f4f" Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.503141 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/3.log" Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.534731 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovn-acl-logging/0.log" Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.535904 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovn-controller/0.log" Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536188 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="7f00afa4ccebb6c76784137043797c0ee3ab98e16e9dffb9acb0f972b0c35b63" exitCode=0 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536208 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf" exitCode=0 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536223 4765 generic.go:334] "Generic (PLEG): container finished" 
podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458" exitCode=0 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536230 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c" exitCode=0 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536239 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da" exitCode=0 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536245 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b" exitCode=0 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536252 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a" exitCode=143 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536259 4765 generic.go:334] "Generic (PLEG): container finished" podID="ea01fba1-445f-46c1-898c-1ceb34866850" containerID="68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca" exitCode=143 Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536279 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"7f00afa4ccebb6c76784137043797c0ee3ab98e16e9dffb9acb0f972b0c35b63"} Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536307 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf"} Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536324 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458"} Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536333 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c"} Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536341 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da"} Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536349 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b"} Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536359 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" 
event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a"} Oct 03 08:50:37 crc kubenswrapper[4765]: I1003 08:50:37.536367 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca"} Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.543863 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/3.log" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.546638 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovn-acl-logging/0.log" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.547241 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovn-controller/0.log" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.547310 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovnkube-controller/3.log" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.547676 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" event={"ID":"ea01fba1-445f-46c1-898c-1ceb34866850","Type":"ContainerDied","Data":"5b0d53b60064af65d60dce9388dbb62aebfc827635dea2e607ca3af27c7883de"} Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.547718 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b0d53b60064af65d60dce9388dbb62aebfc827635dea2e607ca3af27c7883de" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.549904 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovn-acl-logging/0.log" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.550477 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovn-controller/0.log" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.551047 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.602966 4765 scope.go:117] "RemoveContainer" containerID="115444bc9990e2060fb9e8fff1ca7328f3abbaee25879c6af5feac46f0a417bb" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.608906 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vm8hh"] Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609103 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609118 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609126 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="northd" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609131 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="northd" Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609141 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="nbdb" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609148 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="nbdb" Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609157 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovn-acl-logging" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609163 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovn-acl-logging" Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609170 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovn-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609175 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovn-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609188 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="kubecfg-setup" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609194 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="kubecfg-setup" Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609202 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="sbdb" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609208 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="sbdb" Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609215 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609221 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 
08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609228 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609233 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609241 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609247 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609256 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="kube-rbac-proxy-node" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609261 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="kube-rbac-proxy-node" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609346 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609356 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovn-acl-logging" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609367 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovn-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609378 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609385 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609395 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="kube-rbac-proxy-node" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609403 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="northd" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609411 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="sbdb" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609417 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609423 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="nbdb" Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609511 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609518 4765 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: E1003 08:50:38.609527 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609533 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609617 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.609806 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" containerName="ovnkube-controller" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.611278 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.686995 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-openvswitch\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687056 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-ovn\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687086 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-script-lib\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687108 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687142 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-bin\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687159 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-env-overrides\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687184 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4zqv\" (UniqueName: \"kubernetes.io/projected/ea01fba1-445f-46c1-898c-1ceb34866850-kube-api-access-m4zqv\") pod 
\"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687204 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-netns\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687216 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-systemd\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687239 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-systemd-units\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687257 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-kubelet\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687274 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-netd\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687294 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-var-lib-openvswitch\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687311 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ea01fba1-445f-46c1-898c-1ceb34866850-ovn-node-metrics-cert\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687331 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-config\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687346 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-log-socket\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687388 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-node-log\") pod 
\"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687410 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-slash\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687431 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-etc-openvswitch\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687458 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-ovn-kubernetes\") pod \"ea01fba1-445f-46c1-898c-1ceb34866850\" (UID: \"ea01fba1-445f-46c1-898c-1ceb34866850\") " Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687822 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-cni-bin\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687847 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-slash\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687871 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-cni-netd\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687894 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-run-ovn\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687912 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd54bc15-5647-465d-9616-cf09d1ad5885-ovnkube-script-lib\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687947 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-etc-openvswitch\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687967 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-node-log\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.687992 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-run-openvswitch\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688006 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd54bc15-5647-465d-9616-cf09d1ad5885-env-overrides\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688034 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-run-systemd\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688061 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fx26\" (UniqueName: \"kubernetes.io/projected/dd54bc15-5647-465d-9616-cf09d1ad5885-kube-api-access-6fx26\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688082 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd54bc15-5647-465d-9616-cf09d1ad5885-ovn-node-metrics-cert\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688107 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-kubelet\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688126 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-systemd-units\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688153 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-run-ovn-kubernetes\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688178 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd54bc15-5647-465d-9616-cf09d1ad5885-ovnkube-config\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688202 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-var-lib-openvswitch\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688224 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688246 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-run-netns\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688268 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-log-socket\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688405 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688421 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688449 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688471 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688496 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688496 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688536 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-slash" (OuterVolumeSpecName: "host-slash") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688681 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-log-socket" (OuterVolumeSpecName: "log-socket") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688724 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.688416 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.689742 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-node-log" (OuterVolumeSpecName: "node-log") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.689750 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.689787 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.689815 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.692238 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.692666 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.699977 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.699659 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea01fba1-445f-46c1-898c-1ceb34866850-kube-api-access-m4zqv" (OuterVolumeSpecName: "kube-api-access-m4zqv") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "kube-api-access-m4zqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.714457 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.717156 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea01fba1-445f-46c1-898c-1ceb34866850-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ea01fba1-445f-46c1-898c-1ceb34866850" (UID: "ea01fba1-445f-46c1-898c-1ceb34866850"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789442 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-run-ovn-kubernetes\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789749 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd54bc15-5647-465d-9616-cf09d1ad5885-ovnkube-config\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789771 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-var-lib-openvswitch\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789790 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789811 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-run-netns\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789827 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-log-socket\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789845 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-cni-bin\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789860 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-slash\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789879 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-cni-netd\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789912 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-run-ovn\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789928 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd54bc15-5647-465d-9616-cf09d1ad5885-ovnkube-script-lib\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789953 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-etc-openvswitch\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789970 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-node-log\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.789989 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-run-openvswitch\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790004 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd54bc15-5647-465d-9616-cf09d1ad5885-env-overrides\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790030 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-run-systemd\") pod \"ovnkube-node-vm8hh\" (UID: 
\"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790052 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fx26\" (UniqueName: \"kubernetes.io/projected/dd54bc15-5647-465d-9616-cf09d1ad5885-kube-api-access-6fx26\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790071 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd54bc15-5647-465d-9616-cf09d1ad5885-ovn-node-metrics-cert\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790088 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-kubelet\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790103 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-systemd-units\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790141 4765 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790152 4765 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790162 4765 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790170 4765 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790179 4765 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790188 4765 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790196 4765 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-env-overrides\") on 
node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790205 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4zqv\" (UniqueName: \"kubernetes.io/projected/ea01fba1-445f-46c1-898c-1ceb34866850-kube-api-access-m4zqv\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790216 4765 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790230 4765 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790238 4765 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790248 4765 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790256 4765 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790264 4765 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790273 4765 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ea01fba1-445f-46c1-898c-1ceb34866850-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790283 4765 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ea01fba1-445f-46c1-898c-1ceb34866850-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790291 4765 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-log-socket\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790300 4765 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-node-log\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790308 4765 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-host-slash\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790316 4765 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ea01fba1-445f-46c1-898c-1ceb34866850-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:38 crc kubenswrapper[4765]: 
I1003 08:50:38.790353 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-systemd-units\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.790390 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-run-ovn-kubernetes\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791293 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd54bc15-5647-465d-9616-cf09d1ad5885-ovnkube-config\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791331 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-var-lib-openvswitch\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791360 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791387 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-run-netns\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791409 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-log-socket\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791435 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-cni-bin\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791461 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-slash\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791487 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-cni-netd\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791517 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-run-ovn\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791829 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-node-log\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791929 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-etc-openvswitch\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791966 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-host-kubelet\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791980 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-run-systemd\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.791952 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd54bc15-5647-465d-9616-cf09d1ad5885-run-openvswitch\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.792553 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd54bc15-5647-465d-9616-cf09d1ad5885-env-overrides\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.792634 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd54bc15-5647-465d-9616-cf09d1ad5885-ovnkube-script-lib\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.803278 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd54bc15-5647-465d-9616-cf09d1ad5885-ovn-node-metrics-cert\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.817591 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fx26\" (UniqueName: \"kubernetes.io/projected/dd54bc15-5647-465d-9616-cf09d1ad5885-kube-api-access-6fx26\") pod \"ovnkube-node-vm8hh\" (UID: \"dd54bc15-5647-465d-9616-cf09d1ad5885\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: I1003 08:50:38.935724 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:38 crc kubenswrapper[4765]: W1003 08:50:38.957103 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd54bc15_5647_465d_9616_cf09d1ad5885.slice/crio-de6b315568f54e8045e5214cf9826692f48ed67634490c5cb1e948a7c5ac1b64 WatchSource:0}: Error finding container de6b315568f54e8045e5214cf9826692f48ed67634490c5cb1e948a7c5ac1b64: Status 404 returned error can't find the container with id de6b315568f54e8045e5214cf9826692f48ed67634490c5cb1e948a7c5ac1b64 Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.553952 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-b27n7" event={"ID":"bc3e65d2-829a-4385-b865-15e288293af9","Type":"ContainerStarted","Data":"40fa522ea9ed1985192ed0c3c42a31bc65d10c36d7f5781fa552bcb539bcd0bd"} Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.555713 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2" event={"ID":"3f879d8d-cb72-4b5f-a9f6-96ba70a723c0","Type":"ContainerStarted","Data":"969919ff622987267301b5b2a24088af7e46696917ae00486786a340371d02a2"} Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.557226 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-csb5z_912755c8-dd28-4fbc-82de-9cf85df54f4f/kube-multus/2.log" Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.558457 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8" event={"ID":"4310e68b-d273-4784-8e5f-9114306616d8","Type":"ContainerStarted","Data":"9c0d1bf49ba563a9ba0a96f5d0e2e1c1d08b8aef11dd883bb1ff64443efe82b7"} Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.559859 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" event={"ID":"4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56","Type":"ContainerStarted","Data":"04d1fd83f6301699c3c1af2ff51d825c499c95de61a9cb5524137445ad3587b7"} Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.559949 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.563150 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovn-acl-logging/0.log" Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.563713 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-srgbb_ea01fba1-445f-46c1-898c-1ceb34866850/ovn-controller/0.log" Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.564150 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-srgbb" Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.565751 4765 generic.go:334] "Generic (PLEG): container finished" podID="dd54bc15-5647-465d-9616-cf09d1ad5885" containerID="f9dd134e116c7665e809ab8c21e5709975b7bf12a26f98e2ec763dbafb8357b3" exitCode=0 Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.565818 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" event={"ID":"dd54bc15-5647-465d-9616-cf09d1ad5885","Type":"ContainerDied","Data":"f9dd134e116c7665e809ab8c21e5709975b7bf12a26f98e2ec763dbafb8357b3"} Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.565850 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" event={"ID":"dd54bc15-5647-465d-9616-cf09d1ad5885","Type":"ContainerStarted","Data":"de6b315568f54e8045e5214cf9826692f48ed67634490c5cb1e948a7c5ac1b64"} Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.568225 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" event={"ID":"9b3ed21b-9c8f-45e2-a048-c6d1e0324360","Type":"ContainerStarted","Data":"7eaad8a8537bbaa55f7faaf483888fc4acf1f8f231303dc601295ae9e088e903"} Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.568990 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.585107 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-b27n7" podStartSLOduration=2.048293006 podStartE2EDuration="16.585084651s" podCreationTimestamp="2025-10-03 08:50:23 +0000 UTC" firstStartedPulling="2025-10-03 08:50:24.078570313 +0000 UTC m=+668.380064643" lastFinishedPulling="2025-10-03 08:50:38.615361958 +0000 UTC m=+682.916856288" observedRunningTime="2025-10-03 08:50:39.580071523 +0000 UTC m=+683.881565853" watchObservedRunningTime="2025-10-03 08:50:39.585084651 +0000 UTC m=+683.886578981" Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.601385 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.612226 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" podStartSLOduration=2.145895051 podStartE2EDuration="16.612207331s" podCreationTimestamp="2025-10-03 08:50:23 +0000 UTC" firstStartedPulling="2025-10-03 08:50:24.192438613 +0000 UTC m=+668.493932943" lastFinishedPulling="2025-10-03 08:50:38.658750893 +0000 UTC m=+682.960245223" observedRunningTime="2025-10-03 08:50:39.609431171 +0000 UTC m=+683.910925501" watchObservedRunningTime="2025-10-03 08:50:39.612207331 +0000 UTC m=+683.913701661" Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.646315 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-hcgt2" podStartSLOduration=1.911824972 podStartE2EDuration="16.646296809s" podCreationTimestamp="2025-10-03 08:50:23 +0000 UTC" firstStartedPulling="2025-10-03 08:50:23.791089094 +0000 UTC m=+668.092583424" lastFinishedPulling="2025-10-03 08:50:38.525560931 +0000 UTC m=+682.827055261" observedRunningTime="2025-10-03 08:50:39.64555362 +0000 UTC 
m=+683.947047960" watchObservedRunningTime="2025-10-03 08:50:39.646296809 +0000 UTC m=+683.947791139" Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.666456 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-ndvd9" podStartSLOduration=2.1741368899999998 podStartE2EDuration="16.666440042s" podCreationTimestamp="2025-10-03 08:50:23 +0000 UTC" firstStartedPulling="2025-10-03 08:50:24.122459001 +0000 UTC m=+668.423953341" lastFinishedPulling="2025-10-03 08:50:38.614762163 +0000 UTC m=+682.916256493" observedRunningTime="2025-10-03 08:50:39.66320777 +0000 UTC m=+683.964702100" watchObservedRunningTime="2025-10-03 08:50:39.666440042 +0000 UTC m=+683.967934372" Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.726737 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-srgbb"] Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.736397 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-srgbb"] Oct 03 08:50:39 crc kubenswrapper[4765]: I1003 08:50:39.753013 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-665f459cd-2mtm8" podStartSLOduration=2.055527689 podStartE2EDuration="16.752992586s" podCreationTimestamp="2025-10-03 08:50:23 +0000 UTC" firstStartedPulling="2025-10-03 08:50:23.828137276 +0000 UTC m=+668.129631626" lastFinishedPulling="2025-10-03 08:50:38.525602193 +0000 UTC m=+682.827096523" observedRunningTime="2025-10-03 08:50:39.751888408 +0000 UTC m=+684.053382738" watchObservedRunningTime="2025-10-03 08:50:39.752992586 +0000 UTC m=+684.054486916" Oct 03 08:50:40 crc kubenswrapper[4765]: I1003 08:50:40.312840 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea01fba1-445f-46c1-898c-1ceb34866850" path="/var/lib/kubelet/pods/ea01fba1-445f-46c1-898c-1ceb34866850/volumes" Oct 03 08:50:40 crc kubenswrapper[4765]: I1003 08:50:40.577195 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" event={"ID":"dd54bc15-5647-465d-9616-cf09d1ad5885","Type":"ContainerStarted","Data":"6781f0a4d582121574637093c391da958c4119533c345bc82b182e2b0e0ae667"} Oct 03 08:50:40 crc kubenswrapper[4765]: I1003 08:50:40.577243 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" event={"ID":"dd54bc15-5647-465d-9616-cf09d1ad5885","Type":"ContainerStarted","Data":"b07f16f785a45d472831cfdbe8f174bd564080241a020087bda1f70d495d1575"} Oct 03 08:50:40 crc kubenswrapper[4765]: I1003 08:50:40.577258 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" event={"ID":"dd54bc15-5647-465d-9616-cf09d1ad5885","Type":"ContainerStarted","Data":"383fd22412629ff12d0d65bf21d70f7dad42189623ce8ac2de94d4ef175590f8"} Oct 03 08:50:40 crc kubenswrapper[4765]: I1003 08:50:40.577271 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" event={"ID":"dd54bc15-5647-465d-9616-cf09d1ad5885","Type":"ContainerStarted","Data":"29b8f70053252a86e7524235d43dcc376ee7855dc852f5be952b262e464081ce"} Oct 03 08:50:40 crc kubenswrapper[4765]: I1003 08:50:40.577282 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" 
event={"ID":"dd54bc15-5647-465d-9616-cf09d1ad5885","Type":"ContainerStarted","Data":"d09095b20fa12bbe656d71cc771a49a8746188679098693cf461109f2d6d7282"} Oct 03 08:50:40 crc kubenswrapper[4765]: I1003 08:50:40.577292 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" event={"ID":"dd54bc15-5647-465d-9616-cf09d1ad5885","Type":"ContainerStarted","Data":"07bea521f78a2bfe3b33a2e1fda8ad92ff7f941f5a7eb47c3b3cbff591341881"} Oct 03 08:50:43 crc kubenswrapper[4765]: I1003 08:50:43.596982 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" event={"ID":"dd54bc15-5647-465d-9616-cf09d1ad5885","Type":"ContainerStarted","Data":"95fa3c7f5ba5ec853cd09f65225944a819ed9fcdb874f8dcf5b4ddba38ee3c5c"} Oct 03 08:50:43 crc kubenswrapper[4765]: I1003 08:50:43.908749 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-76rtm" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.340810 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t"] Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.342334 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.344436 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.490414 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.490741 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7svrz\" (UniqueName: \"kubernetes.io/projected/e758dad4-a664-40cc-b2e1-e7e43c757276-kube-api-access-7svrz\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.490860 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.592633 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7svrz\" (UniqueName: \"kubernetes.io/projected/e758dad4-a664-40cc-b2e1-e7e43c757276-kube-api-access-7svrz\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 
08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.592703 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.592754 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.593218 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.593263 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.610820 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" event={"ID":"dd54bc15-5647-465d-9616-cf09d1ad5885","Type":"ContainerStarted","Data":"4612be6f8a33fa8095c79af58e1462cb8595642976481b5f9feb83aacc20931a"} Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.612224 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.612264 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.612317 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.616472 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7svrz\" (UniqueName: \"kubernetes.io/projected/e758dad4-a664-40cc-b2e1-e7e43c757276-kube-api-access-7svrz\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.647160 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.649887 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 
08:50:45.656248 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.690918 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" podStartSLOduration=7.690896405 podStartE2EDuration="7.690896405s" podCreationTimestamp="2025-10-03 08:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:50:45.685311673 +0000 UTC m=+689.986806023" watchObservedRunningTime="2025-10-03 08:50:45.690896405 +0000 UTC m=+689.992390735" Oct 03 08:50:45 crc kubenswrapper[4765]: E1003 08:50:45.691063 4765 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(b89ce8cd5a69009222075e7a7127ba66966923564d7bd0df7215f5c9a15aa91a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 08:50:45 crc kubenswrapper[4765]: E1003 08:50:45.691154 4765 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(b89ce8cd5a69009222075e7a7127ba66966923564d7bd0df7215f5c9a15aa91a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: E1003 08:50:45.691189 4765 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(b89ce8cd5a69009222075e7a7127ba66966923564d7bd0df7215f5c9a15aa91a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:45 crc kubenswrapper[4765]: E1003 08:50:45.691268 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace(e758dad4-a664-40cc-b2e1-e7e43c757276)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace(e758dad4-a664-40cc-b2e1-e7e43c757276)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(b89ce8cd5a69009222075e7a7127ba66966923564d7bd0df7215f5c9a15aa91a): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" podUID="e758dad4-a664-40cc-b2e1-e7e43c757276" Oct 03 08:50:45 crc kubenswrapper[4765]: I1003 08:50:45.694357 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t"] Oct 03 08:50:46 crc kubenswrapper[4765]: I1003 08:50:46.615358 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:46 crc kubenswrapper[4765]: I1003 08:50:46.615936 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:46 crc kubenswrapper[4765]: E1003 08:50:46.646879 4765 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(f3c3911085af7e35eb5fc1a7bc8aac0e003a7962a7dcba4c76b2bdc13d9477f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 08:50:46 crc kubenswrapper[4765]: E1003 08:50:46.646954 4765 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(f3c3911085af7e35eb5fc1a7bc8aac0e003a7962a7dcba4c76b2bdc13d9477f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:46 crc kubenswrapper[4765]: E1003 08:50:46.646980 4765 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(f3c3911085af7e35eb5fc1a7bc8aac0e003a7962a7dcba4c76b2bdc13d9477f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:50:46 crc kubenswrapper[4765]: E1003 08:50:46.647027 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace(e758dad4-a664-40cc-b2e1-e7e43c757276)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace(e758dad4-a664-40cc-b2e1-e7e43c757276)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(f3c3911085af7e35eb5fc1a7bc8aac0e003a7962a7dcba4c76b2bdc13d9477f8): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" podUID="e758dad4-a664-40cc-b2e1-e7e43c757276" Oct 03 08:50:52 crc kubenswrapper[4765]: I1003 08:50:52.307366 4765 scope.go:117] "RemoveContainer" containerID="8d3bc39e0926f0219495285f71e5ec98da034b168a3092f1121da2eabf6b6f10" Oct 03 08:50:52 crc kubenswrapper[4765]: E1003 08:50:52.307937 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-csb5z_openshift-multus(912755c8-dd28-4fbc-82de-9cf85df54f4f)\"" pod="openshift-multus/multus-csb5z" podUID="912755c8-dd28-4fbc-82de-9cf85df54f4f" Oct 03 08:51:00 crc kubenswrapper[4765]: I1003 08:51:00.305731 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:51:00 crc kubenswrapper[4765]: I1003 08:51:00.306785 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:51:00 crc kubenswrapper[4765]: E1003 08:51:00.332886 4765 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(f7753eeb8c6220a5b68131d4cabca9ad920a20315907f6b3436d28e000678fe4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 08:51:00 crc kubenswrapper[4765]: E1003 08:51:00.332968 4765 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(f7753eeb8c6220a5b68131d4cabca9ad920a20315907f6b3436d28e000678fe4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:51:00 crc kubenswrapper[4765]: E1003 08:51:00.333000 4765 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(f7753eeb8c6220a5b68131d4cabca9ad920a20315907f6b3436d28e000678fe4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:51:00 crc kubenswrapper[4765]: E1003 08:51:00.333062 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace(e758dad4-a664-40cc-b2e1-e7e43c757276)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace(e758dad4-a664-40cc-b2e1-e7e43c757276)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_openshift-marketplace_e758dad4-a664-40cc-b2e1-e7e43c757276_0(f7753eeb8c6220a5b68131d4cabca9ad920a20315907f6b3436d28e000678fe4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" podUID="e758dad4-a664-40cc-b2e1-e7e43c757276" Oct 03 08:51:05 crc kubenswrapper[4765]: I1003 08:51:05.307466 4765 scope.go:117] "RemoveContainer" containerID="8d3bc39e0926f0219495285f71e5ec98da034b168a3092f1121da2eabf6b6f10" Oct 03 08:51:07 crc kubenswrapper[4765]: I1003 08:51:07.718122 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-csb5z_912755c8-dd28-4fbc-82de-9cf85df54f4f/kube-multus/2.log" Oct 03 08:51:07 crc kubenswrapper[4765]: I1003 08:51:07.718612 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-csb5z" event={"ID":"912755c8-dd28-4fbc-82de-9cf85df54f4f","Type":"ContainerStarted","Data":"50d14a20bcf277ffb48fde60c3d42d00d08b0bf1366e59048acda943421aec75"} Oct 03 08:51:08 crc kubenswrapper[4765]: I1003 08:51:08.953981 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vm8hh" Oct 03 08:51:13 crc kubenswrapper[4765]: I1003 08:51:13.306359 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:51:13 crc kubenswrapper[4765]: I1003 08:51:13.307218 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:51:13 crc kubenswrapper[4765]: I1003 08:51:13.681427 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t"] Oct 03 08:51:13 crc kubenswrapper[4765]: I1003 08:51:13.749055 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" event={"ID":"e758dad4-a664-40cc-b2e1-e7e43c757276","Type":"ContainerStarted","Data":"aab376700d26f58c003377e08c2be043c12c73bcd0d807936db027dbfe6547d4"} Oct 03 08:51:13 crc kubenswrapper[4765]: E1003 08:51:13.928377 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode758dad4_a664_40cc_b2e1_e7e43c757276.slice/crio-8c10fea64cfee518113e95c04918ddec37f1b963d3d7f3ec486aeefe2de9dd80.scope\": RecentStats: unable to find data in memory cache]" Oct 03 08:51:14 crc kubenswrapper[4765]: I1003 08:51:14.756088 4765 generic.go:334] "Generic (PLEG): container finished" podID="e758dad4-a664-40cc-b2e1-e7e43c757276" containerID="8c10fea64cfee518113e95c04918ddec37f1b963d3d7f3ec486aeefe2de9dd80" exitCode=0 Oct 03 08:51:14 crc kubenswrapper[4765]: I1003 08:51:14.756194 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" event={"ID":"e758dad4-a664-40cc-b2e1-e7e43c757276","Type":"ContainerDied","Data":"8c10fea64cfee518113e95c04918ddec37f1b963d3d7f3ec486aeefe2de9dd80"} Oct 03 08:51:15 crc kubenswrapper[4765]: I1003 08:51:15.763396 4765 generic.go:334] "Generic (PLEG): container finished" podID="e758dad4-a664-40cc-b2e1-e7e43c757276" containerID="ad59af45d1ec364309577f1ffb5e9bcb5b269c0a4a9f1a6adee91b9d2f9400db" exitCode=0 Oct 03 08:51:15 crc kubenswrapper[4765]: I1003 08:51:15.763491 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" event={"ID":"e758dad4-a664-40cc-b2e1-e7e43c757276","Type":"ContainerDied","Data":"ad59af45d1ec364309577f1ffb5e9bcb5b269c0a4a9f1a6adee91b9d2f9400db"} Oct 03 08:51:16 crc kubenswrapper[4765]: I1003 08:51:16.479025 4765 scope.go:117] "RemoveContainer" containerID="95502595a856f5f235331ab5db3d4f97a50f968857c1962d12b873a714689f0c" Oct 03 08:51:16 crc kubenswrapper[4765]: I1003 08:51:16.498557 4765 scope.go:117] "RemoveContainer" containerID="902d94d2cc9ce526c6ea774f1bb70fbee7da85cedab72fcd842f87d47ee8a458" Oct 03 08:51:16 crc kubenswrapper[4765]: I1003 08:51:16.513822 4765 scope.go:117] "RemoveContainer" containerID="a3ad66691c9dcf004703b79d697a78f9b42791fafba2ddf278997b6ad28bdd4a" Oct 03 08:51:16 crc kubenswrapper[4765]: I1003 08:51:16.532158 4765 scope.go:117] "RemoveContainer" containerID="6d5d60eb6ab5ff22cc2c6826b1d47220bb827fa0429f2a59020ae01d0a43f6bf" Oct 03 08:51:16 crc kubenswrapper[4765]: I1003 08:51:16.553258 4765 scope.go:117] "RemoveContainer" containerID="761ad034a8a6a7a6280fcccaca136b26ddd423885bc2a7ab6b8e1856f16f7416" Oct 03 08:51:16 crc kubenswrapper[4765]: I1003 08:51:16.578417 4765 scope.go:117] "RemoveContainer" containerID="7f00afa4ccebb6c76784137043797c0ee3ab98e16e9dffb9acb0f972b0c35b63" Oct 03 08:51:16 crc kubenswrapper[4765]: I1003 08:51:16.598865 4765 scope.go:117] "RemoveContainer" 
containerID="d73e2e54676fc570262cfd551322ed003812c372ddc25695ca3b34ae2a05423b" Oct 03 08:51:16 crc kubenswrapper[4765]: I1003 08:51:16.617700 4765 scope.go:117] "RemoveContainer" containerID="fa40947035e07c4926ee170348e2bd545830d0c6c1fa6b59a2aa7f12eac2c6da" Oct 03 08:51:16 crc kubenswrapper[4765]: I1003 08:51:16.646578 4765 scope.go:117] "RemoveContainer" containerID="68b9b8a7ec5c072f50d44aa0d3800b7cdee18bdd868d37ec129ceb37a23bd3ca" Oct 03 08:51:16 crc kubenswrapper[4765]: I1003 08:51:16.772787 4765 generic.go:334] "Generic (PLEG): container finished" podID="e758dad4-a664-40cc-b2e1-e7e43c757276" containerID="3f4a984cfa77d71040d2aaf6b6d58fa361a48a20df48ea74d112a210087d8c1c" exitCode=0 Oct 03 08:51:16 crc kubenswrapper[4765]: I1003 08:51:16.772843 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" event={"ID":"e758dad4-a664-40cc-b2e1-e7e43c757276","Type":"ContainerDied","Data":"3f4a984cfa77d71040d2aaf6b6d58fa361a48a20df48ea74d112a210087d8c1c"} Oct 03 08:51:17 crc kubenswrapper[4765]: I1003 08:51:17.967272 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.092324 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-util\") pod \"e758dad4-a664-40cc-b2e1-e7e43c757276\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.092467 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7svrz\" (UniqueName: \"kubernetes.io/projected/e758dad4-a664-40cc-b2e1-e7e43c757276-kube-api-access-7svrz\") pod \"e758dad4-a664-40cc-b2e1-e7e43c757276\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.092500 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-bundle\") pod \"e758dad4-a664-40cc-b2e1-e7e43c757276\" (UID: \"e758dad4-a664-40cc-b2e1-e7e43c757276\") " Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.093990 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-bundle" (OuterVolumeSpecName: "bundle") pod "e758dad4-a664-40cc-b2e1-e7e43c757276" (UID: "e758dad4-a664-40cc-b2e1-e7e43c757276"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.100276 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e758dad4-a664-40cc-b2e1-e7e43c757276-kube-api-access-7svrz" (OuterVolumeSpecName: "kube-api-access-7svrz") pod "e758dad4-a664-40cc-b2e1-e7e43c757276" (UID: "e758dad4-a664-40cc-b2e1-e7e43c757276"). InnerVolumeSpecName "kube-api-access-7svrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.110894 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-util" (OuterVolumeSpecName: "util") pod "e758dad4-a664-40cc-b2e1-e7e43c757276" (UID: "e758dad4-a664-40cc-b2e1-e7e43c757276"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.194106 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7svrz\" (UniqueName: \"kubernetes.io/projected/e758dad4-a664-40cc-b2e1-e7e43c757276-kube-api-access-7svrz\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.194156 4765 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.194166 4765 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e758dad4-a664-40cc-b2e1-e7e43c757276-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.787437 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" event={"ID":"e758dad4-a664-40cc-b2e1-e7e43c757276","Type":"ContainerDied","Data":"aab376700d26f58c003377e08c2be043c12c73bcd0d807936db027dbfe6547d4"} Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.787942 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aab376700d26f58c003377e08c2be043c12c73bcd0d807936db027dbfe6547d4" Oct 03 08:51:18 crc kubenswrapper[4765]: I1003 08:51:18.787523 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t" Oct 03 08:51:21 crc kubenswrapper[4765]: I1003 08:51:21.914309 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-r5vr9"] Oct 03 08:51:21 crc kubenswrapper[4765]: E1003 08:51:21.914823 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e758dad4-a664-40cc-b2e1-e7e43c757276" containerName="util" Oct 03 08:51:21 crc kubenswrapper[4765]: I1003 08:51:21.914836 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e758dad4-a664-40cc-b2e1-e7e43c757276" containerName="util" Oct 03 08:51:21 crc kubenswrapper[4765]: E1003 08:51:21.914848 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e758dad4-a664-40cc-b2e1-e7e43c757276" containerName="extract" Oct 03 08:51:21 crc kubenswrapper[4765]: I1003 08:51:21.914854 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e758dad4-a664-40cc-b2e1-e7e43c757276" containerName="extract" Oct 03 08:51:21 crc kubenswrapper[4765]: E1003 08:51:21.914865 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e758dad4-a664-40cc-b2e1-e7e43c757276" containerName="pull" Oct 03 08:51:21 crc kubenswrapper[4765]: I1003 08:51:21.914872 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e758dad4-a664-40cc-b2e1-e7e43c757276" containerName="pull" Oct 03 08:51:21 crc kubenswrapper[4765]: I1003 08:51:21.914993 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e758dad4-a664-40cc-b2e1-e7e43c757276" containerName="extract" Oct 03 08:51:21 crc kubenswrapper[4765]: I1003 08:51:21.915423 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-r5vr9" Oct 03 08:51:21 crc kubenswrapper[4765]: I1003 08:51:21.918205 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 03 08:51:21 crc kubenswrapper[4765]: I1003 08:51:21.918592 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 03 08:51:21 crc kubenswrapper[4765]: I1003 08:51:21.918842 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-zp76f" Oct 03 08:51:21 crc kubenswrapper[4765]: I1003 08:51:21.936694 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-r5vr9"] Oct 03 08:51:22 crc kubenswrapper[4765]: I1003 08:51:22.047190 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jcsm\" (UniqueName: \"kubernetes.io/projected/6a17b86e-1da6-447f-abd8-54873d510ee3-kube-api-access-6jcsm\") pod \"nmstate-operator-858ddd8f98-r5vr9\" (UID: \"6a17b86e-1da6-447f-abd8-54873d510ee3\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-r5vr9" Oct 03 08:51:22 crc kubenswrapper[4765]: I1003 08:51:22.149061 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jcsm\" (UniqueName: \"kubernetes.io/projected/6a17b86e-1da6-447f-abd8-54873d510ee3-kube-api-access-6jcsm\") pod \"nmstate-operator-858ddd8f98-r5vr9\" (UID: \"6a17b86e-1da6-447f-abd8-54873d510ee3\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-r5vr9" Oct 03 08:51:22 crc kubenswrapper[4765]: I1003 08:51:22.175142 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jcsm\" (UniqueName: \"kubernetes.io/projected/6a17b86e-1da6-447f-abd8-54873d510ee3-kube-api-access-6jcsm\") pod \"nmstate-operator-858ddd8f98-r5vr9\" (UID: \"6a17b86e-1da6-447f-abd8-54873d510ee3\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-r5vr9" Oct 03 08:51:22 crc kubenswrapper[4765]: I1003 08:51:22.235504 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-r5vr9" Oct 03 08:51:22 crc kubenswrapper[4765]: I1003 08:51:22.651030 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-r5vr9"] Oct 03 08:51:22 crc kubenswrapper[4765]: I1003 08:51:22.820324 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-r5vr9" event={"ID":"6a17b86e-1da6-447f-abd8-54873d510ee3","Type":"ContainerStarted","Data":"491c9d54db7960191cb6dd016b298c48c19936f97be646ba6d102e149e81c9e1"} Oct 03 08:51:25 crc kubenswrapper[4765]: I1003 08:51:25.846980 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-r5vr9" event={"ID":"6a17b86e-1da6-447f-abd8-54873d510ee3","Type":"ContainerStarted","Data":"fb3eae8a83f279ab78acc01614f125a24e17d2dada464dde0f3cc66fb82fcb1a"} Oct 03 08:51:25 crc kubenswrapper[4765]: I1003 08:51:25.873467 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-r5vr9" podStartSLOduration=2.84532844 podStartE2EDuration="4.873434781s" podCreationTimestamp="2025-10-03 08:51:21 +0000 UTC" firstStartedPulling="2025-10-03 08:51:22.667077428 +0000 UTC m=+726.968571758" lastFinishedPulling="2025-10-03 08:51:24.695183769 +0000 UTC m=+728.996678099" observedRunningTime="2025-10-03 08:51:25.873067271 +0000 UTC m=+730.174561611" watchObservedRunningTime="2025-10-03 08:51:25.873434781 +0000 UTC m=+730.174929141" Oct 03 08:51:26 crc kubenswrapper[4765]: I1003 08:51:26.947130 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v"] Oct 03 08:51:26 crc kubenswrapper[4765]: I1003 08:51:26.948116 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v" Oct 03 08:51:26 crc kubenswrapper[4765]: I1003 08:51:26.949833 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-b99rl" Oct 03 08:51:26 crc kubenswrapper[4765]: I1003 08:51:26.964923 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v"] Oct 03 08:51:26 crc kubenswrapper[4765]: I1003 08:51:26.972353 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz"] Oct 03 08:51:26 crc kubenswrapper[4765]: I1003 08:51:26.975365 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" Oct 03 08:51:26 crc kubenswrapper[4765]: I1003 08:51:26.977175 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.000737 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz"] Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.011216 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bwwmp"] Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.014542 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.028281 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/572363ec-fce1-468d-998d-3ae0dac9c35a-dbus-socket\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.028373 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8bn2\" (UniqueName: \"kubernetes.io/projected/079a3543-9818-46c7-8500-d84424d4f411-kube-api-access-r8bn2\") pod \"nmstate-metrics-fdff9cb8d-6gs2v\" (UID: \"079a3543-9818-46c7-8500-d84424d4f411\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.028402 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/572363ec-fce1-468d-998d-3ae0dac9c35a-ovs-socket\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.028432 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5gd\" (UniqueName: \"kubernetes.io/projected/572363ec-fce1-468d-998d-3ae0dac9c35a-kube-api-access-lc5gd\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.028459 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-m6bmz\" (UID: \"e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.028489 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/572363ec-fce1-468d-998d-3ae0dac9c35a-nmstate-lock\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.028515 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbdk6\" (UniqueName: \"kubernetes.io/projected/e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0-kube-api-access-cbdk6\") pod \"nmstate-webhook-6cdbc54649-m6bmz\" (UID: \"e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.120428 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq"] Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.121110 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.123426 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.123494 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.123865 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-tb7d5" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.129777 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-bvxsq\" (UID: \"6a5b68b1-49a0-4e3c-a08b-fc4de9903242\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.129823 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-bvxsq\" (UID: \"6a5b68b1-49a0-4e3c-a08b-fc4de9903242\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.129864 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpbb9\" (UniqueName: \"kubernetes.io/projected/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-kube-api-access-lpbb9\") pod \"nmstate-console-plugin-6b874cbd85-bvxsq\" (UID: \"6a5b68b1-49a0-4e3c-a08b-fc4de9903242\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.129903 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8bn2\" (UniqueName: \"kubernetes.io/projected/079a3543-9818-46c7-8500-d84424d4f411-kube-api-access-r8bn2\") pod \"nmstate-metrics-fdff9cb8d-6gs2v\" (UID: \"079a3543-9818-46c7-8500-d84424d4f411\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.129923 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/572363ec-fce1-468d-998d-3ae0dac9c35a-ovs-socket\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.129972 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/572363ec-fce1-468d-998d-3ae0dac9c35a-ovs-socket\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.130022 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5gd\" (UniqueName: \"kubernetes.io/projected/572363ec-fce1-468d-998d-3ae0dac9c35a-kube-api-access-lc5gd\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.130047 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-m6bmz\" (UID: \"e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.130082 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/572363ec-fce1-468d-998d-3ae0dac9c35a-nmstate-lock\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.130107 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbdk6\" (UniqueName: \"kubernetes.io/projected/e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0-kube-api-access-cbdk6\") pod \"nmstate-webhook-6cdbc54649-m6bmz\" (UID: \"e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.130132 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/572363ec-fce1-468d-998d-3ae0dac9c35a-dbus-socket\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.130425 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/572363ec-fce1-468d-998d-3ae0dac9c35a-nmstate-lock\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.130567 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/572363ec-fce1-468d-998d-3ae0dac9c35a-dbus-socket\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.136237 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq"] Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.147899 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-m6bmz\" (UID: \"e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.150627 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8bn2\" (UniqueName: \"kubernetes.io/projected/079a3543-9818-46c7-8500-d84424d4f411-kube-api-access-r8bn2\") pod \"nmstate-metrics-fdff9cb8d-6gs2v\" (UID: \"079a3543-9818-46c7-8500-d84424d4f411\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.153853 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbdk6\" (UniqueName: \"kubernetes.io/projected/e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0-kube-api-access-cbdk6\") pod \"nmstate-webhook-6cdbc54649-m6bmz\" (UID: 
\"e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.159080 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5gd\" (UniqueName: \"kubernetes.io/projected/572363ec-fce1-468d-998d-3ae0dac9c35a-kube-api-access-lc5gd\") pod \"nmstate-handler-bwwmp\" (UID: \"572363ec-fce1-468d-998d-3ae0dac9c35a\") " pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.231320 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-bvxsq\" (UID: \"6a5b68b1-49a0-4e3c-a08b-fc4de9903242\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.231387 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-bvxsq\" (UID: \"6a5b68b1-49a0-4e3c-a08b-fc4de9903242\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.231441 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpbb9\" (UniqueName: \"kubernetes.io/projected/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-kube-api-access-lpbb9\") pod \"nmstate-console-plugin-6b874cbd85-bvxsq\" (UID: \"6a5b68b1-49a0-4e3c-a08b-fc4de9903242\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:27 crc kubenswrapper[4765]: E1003 08:51:27.231780 4765 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 03 08:51:27 crc kubenswrapper[4765]: E1003 08:51:27.231888 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-plugin-serving-cert podName:6a5b68b1-49a0-4e3c-a08b-fc4de9903242 nodeName:}" failed. No retries permitted until 2025-10-03 08:51:27.73186005 +0000 UTC m=+732.033354380 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-bvxsq" (UID: "6a5b68b1-49a0-4e3c-a08b-fc4de9903242") : secret "plugin-serving-cert" not found Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.232972 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-bvxsq\" (UID: \"6a5b68b1-49a0-4e3c-a08b-fc4de9903242\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.259098 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpbb9\" (UniqueName: \"kubernetes.io/projected/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-kube-api-access-lpbb9\") pod \"nmstate-console-plugin-6b874cbd85-bvxsq\" (UID: \"6a5b68b1-49a0-4e3c-a08b-fc4de9903242\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.266783 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.293592 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.333936 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.341552 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-848877996-52ncb"] Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.342432 4765 util.go:30] "No sandbox for pod can be found. 
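
The MountVolume.SetUp failure above is the kubelet waiting for the plugin-serving-cert Secret, which is typically created asynchronously (on OpenShift, usually by the service-ca operator) after the pod is scheduled. The operation is requeued with a 500ms backoff and, as the later "MountVolume.SetUp succeeded" entry at 08:51:27.751 shows, it clears as soon as the Secret exists. A minimal sketch of how such a Secret-backed volume is declared, using the upstream k8s.io/api types (the mount path is an assumption, not taken from the log):

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        // Secret-backed volume like "plugin-serving-cert". If the referenced
        // Secret does not exist yet, the kubelet's volume manager keeps
        // retrying SetUp (the log shows a 500ms durationBeforeRetry) and the
        // pod stays in ContainerCreating until the mount succeeds.
        vol := corev1.Volume{
            Name: "plugin-serving-cert",
            VolumeSource: corev1.VolumeSource{
                Secret: &corev1.SecretVolumeSource{
                    SecretName: "plugin-serving-cert",
                },
            },
        }
        mount := corev1.VolumeMount{
            Name:      "plugin-serving-cert",
            MountPath: "/var/serving-cert", // assumed path, not from the log
            ReadOnly:  true,
        }
        fmt.Println(vol.Name, "->", mount.MountPath)
    }
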
Need to start a new one" pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.365920 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-848877996-52ncb"] Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.538178 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-oauth-config\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.538466 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkvs7\" (UniqueName: \"kubernetes.io/projected/64809084-8cca-4e95-ace6-5ecfcf98b208-kube-api-access-kkvs7\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.538495 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-serving-cert\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.538526 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-service-ca\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.538601 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-oauth-serving-cert\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.538667 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-trusted-ca-bundle\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.538688 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-console-config\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.639903 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-service-ca\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 
08:51:27.639986 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-oauth-serving-cert\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.640021 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-trusted-ca-bundle\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.640046 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-console-config\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.640121 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-oauth-config\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.640157 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkvs7\" (UniqueName: \"kubernetes.io/projected/64809084-8cca-4e95-ace6-5ecfcf98b208-kube-api-access-kkvs7\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.640217 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-serving-cert\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.641267 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-service-ca\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.641291 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-oauth-serving-cert\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.641462 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-trusted-ca-bundle\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.642107 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-console-config\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.647334 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-serving-cert\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.647423 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-oauth-config\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.657068 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkvs7\" (UniqueName: \"kubernetes.io/projected/64809084-8cca-4e95-ace6-5ecfcf98b208-kube-api-access-kkvs7\") pod \"console-848877996-52ncb\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.676708 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.743838 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-bvxsq\" (UID: \"6a5b68b1-49a0-4e3c-a08b-fc4de9903242\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.751898 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a5b68b1-49a0-4e3c-a08b-fc4de9903242-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-bvxsq\" (UID: \"6a5b68b1-49a0-4e3c-a08b-fc4de9903242\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.783128 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v"] Oct 03 08:51:27 crc kubenswrapper[4765]: W1003 08:51:27.792565 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod079a3543_9818_46c7_8500_d84424d4f411.slice/crio-2df7b5252f3b506c3d53fc6dc28b8266e1c25fd29267668a39e96516f7e38897 WatchSource:0}: Error finding container 2df7b5252f3b506c3d53fc6dc28b8266e1c25fd29267668a39e96516f7e38897: Status 404 returned error can't find the container with id 2df7b5252f3b506c3d53fc6dc28b8266e1c25fd29267668a39e96516f7e38897 Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.823084 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz"] Oct 03 08:51:27 crc kubenswrapper[4765]: W1003 08:51:27.830894 4765 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3620a8c_3a6c_4f66_8101_cd7f7b91f7d0.slice/crio-4a13253932e9d78f62e44b378fa687bf8dae27c7fcdf3297f549785fd5f0498c WatchSource:0}: Error finding container 4a13253932e9d78f62e44b378fa687bf8dae27c7fcdf3297f549785fd5f0498c: Status 404 returned error can't find the container with id 4a13253932e9d78f62e44b378fa687bf8dae27c7fcdf3297f549785fd5f0498c Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.866171 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v" event={"ID":"079a3543-9818-46c7-8500-d84424d4f411","Type":"ContainerStarted","Data":"2df7b5252f3b506c3d53fc6dc28b8266e1c25fd29267668a39e96516f7e38897"} Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.867330 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" event={"ID":"e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0","Type":"ContainerStarted","Data":"4a13253932e9d78f62e44b378fa687bf8dae27c7fcdf3297f549785fd5f0498c"} Oct 03 08:51:27 crc kubenswrapper[4765]: I1003 08:51:27.869399 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bwwmp" event={"ID":"572363ec-fce1-468d-998d-3ae0dac9c35a","Type":"ContainerStarted","Data":"5366878ee809e315873d530bc166bd5728a97ccbc66f88782db73ecf3573ee4f"} Oct 03 08:51:28 crc kubenswrapper[4765]: I1003 08:51:28.037562 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" Oct 03 08:51:28 crc kubenswrapper[4765]: I1003 08:51:28.158229 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-848877996-52ncb"] Oct 03 08:51:28 crc kubenswrapper[4765]: W1003 08:51:28.166486 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64809084_8cca_4e95_ace6_5ecfcf98b208.slice/crio-f1a9920efd5c05acc7a769148e2d708e52c9564ff040eda2315f13862846dd64 WatchSource:0}: Error finding container f1a9920efd5c05acc7a769148e2d708e52c9564ff040eda2315f13862846dd64: Status 404 returned error can't find the container with id f1a9920efd5c05acc7a769148e2d708e52c9564ff040eda2315f13862846dd64 Oct 03 08:51:28 crc kubenswrapper[4765]: I1003 08:51:28.239238 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq"] Oct 03 08:51:28 crc kubenswrapper[4765]: W1003 08:51:28.248698 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a5b68b1_49a0_4e3c_a08b_fc4de9903242.slice/crio-99d7cdae2cb706c9866e1637cc0d69088da9087af9b4998b477d3716ccfe9aac WatchSource:0}: Error finding container 99d7cdae2cb706c9866e1637cc0d69088da9087af9b4998b477d3716ccfe9aac: Status 404 returned error can't find the container with id 99d7cdae2cb706c9866e1637cc0d69088da9087af9b4998b477d3716ccfe9aac Oct 03 08:51:28 crc kubenswrapper[4765]: I1003 08:51:28.879119 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" event={"ID":"6a5b68b1-49a0-4e3c-a08b-fc4de9903242","Type":"ContainerStarted","Data":"99d7cdae2cb706c9866e1637cc0d69088da9087af9b4998b477d3716ccfe9aac"} Oct 03 08:51:28 crc kubenswrapper[4765]: I1003 08:51:28.887860 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-848877996-52ncb" 
event={"ID":"64809084-8cca-4e95-ace6-5ecfcf98b208","Type":"ContainerStarted","Data":"2d9da8541379e88bba639a15369f9fde0e3eb86a21afce0f6a1db49db0ab4b39"} Oct 03 08:51:28 crc kubenswrapper[4765]: I1003 08:51:28.887907 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-848877996-52ncb" event={"ID":"64809084-8cca-4e95-ace6-5ecfcf98b208","Type":"ContainerStarted","Data":"f1a9920efd5c05acc7a769148e2d708e52c9564ff040eda2315f13862846dd64"} Oct 03 08:51:30 crc kubenswrapper[4765]: I1003 08:51:30.680826 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:51:30 crc kubenswrapper[4765]: I1003 08:51:30.681583 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:51:30 crc kubenswrapper[4765]: I1003 08:51:30.923559 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" event={"ID":"e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0","Type":"ContainerStarted","Data":"58fda5d460a15db1f51fa5c484621cbc07d92b4d85239a42daf8b34c4874c1d3"} Oct 03 08:51:30 crc kubenswrapper[4765]: I1003 08:51:30.924818 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" Oct 03 08:51:30 crc kubenswrapper[4765]: I1003 08:51:30.937090 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bwwmp" event={"ID":"572363ec-fce1-468d-998d-3ae0dac9c35a","Type":"ContainerStarted","Data":"d849de794dd2c1e6761ea7b0e81825b93db55371e3d32eac96a9ef1071b87049"} Oct 03 08:51:30 crc kubenswrapper[4765]: I1003 08:51:30.937277 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:30 crc kubenswrapper[4765]: I1003 08:51:30.943250 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v" event={"ID":"079a3543-9818-46c7-8500-d84424d4f411","Type":"ContainerStarted","Data":"ff9f5f66ebb5e3650c68ca650785ef832e367b10ef7408aa6b34e4c908c15413"} Oct 03 08:51:30 crc kubenswrapper[4765]: I1003 08:51:30.949637 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-848877996-52ncb" podStartSLOduration=3.949615473 podStartE2EDuration="3.949615473s" podCreationTimestamp="2025-10-03 08:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:51:28.921066511 +0000 UTC m=+733.222560841" watchObservedRunningTime="2025-10-03 08:51:30.949615473 +0000 UTC m=+735.251109803" Oct 03 08:51:30 crc kubenswrapper[4765]: I1003 08:51:30.951805 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" podStartSLOduration=2.659529992 podStartE2EDuration="4.951795829s" podCreationTimestamp="2025-10-03 08:51:26 +0000 UTC" firstStartedPulling="2025-10-03 08:51:27.833703274 +0000 UTC m=+732.135197604" lastFinishedPulling="2025-10-03 
08:51:30.125969101 +0000 UTC m=+734.427463441" observedRunningTime="2025-10-03 08:51:30.948153376 +0000 UTC m=+735.249647726" watchObservedRunningTime="2025-10-03 08:51:30.951795829 +0000 UTC m=+735.253290159" Oct 03 08:51:30 crc kubenswrapper[4765]: I1003 08:51:30.982342 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bwwmp" podStartSLOduration=2.247150342 podStartE2EDuration="4.982316726s" podCreationTimestamp="2025-10-03 08:51:26 +0000 UTC" firstStartedPulling="2025-10-03 08:51:27.389929895 +0000 UTC m=+731.691424215" lastFinishedPulling="2025-10-03 08:51:30.125096259 +0000 UTC m=+734.426590599" observedRunningTime="2025-10-03 08:51:30.981112055 +0000 UTC m=+735.282606385" watchObservedRunningTime="2025-10-03 08:51:30.982316726 +0000 UTC m=+735.283811056" Oct 03 08:51:31 crc kubenswrapper[4765]: I1003 08:51:31.951620 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" event={"ID":"6a5b68b1-49a0-4e3c-a08b-fc4de9903242","Type":"ContainerStarted","Data":"59e9b26918bea24fd037ddbb2eabd6ced6447d976707ef97eddc7b72f7008b41"} Oct 03 08:51:31 crc kubenswrapper[4765]: I1003 08:51:31.984862 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-bvxsq" podStartSLOduration=1.868089721 podStartE2EDuration="4.984825022s" podCreationTimestamp="2025-10-03 08:51:27 +0000 UTC" firstStartedPulling="2025-10-03 08:51:28.251365728 +0000 UTC m=+732.552860058" lastFinishedPulling="2025-10-03 08:51:31.368101029 +0000 UTC m=+735.669595359" observedRunningTime="2025-10-03 08:51:31.977620808 +0000 UTC m=+736.279115138" watchObservedRunningTime="2025-10-03 08:51:31.984825022 +0000 UTC m=+736.286319352" Oct 03 08:51:32 crc kubenswrapper[4765]: I1003 08:51:32.962263 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v" event={"ID":"079a3543-9818-46c7-8500-d84424d4f411","Type":"ContainerStarted","Data":"34c7fc5915ee6d03792818fcb74317cab5a3adfdd9b89dde084d45624865e4e9"} Oct 03 08:51:32 crc kubenswrapper[4765]: I1003 08:51:32.983421 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-6gs2v" podStartSLOduration=1.9844278119999998 podStartE2EDuration="6.983403728s" podCreationTimestamp="2025-10-03 08:51:26 +0000 UTC" firstStartedPulling="2025-10-03 08:51:27.795275876 +0000 UTC m=+732.096770206" lastFinishedPulling="2025-10-03 08:51:32.794251792 +0000 UTC m=+737.095746122" observedRunningTime="2025-10-03 08:51:32.979226462 +0000 UTC m=+737.280720792" watchObservedRunningTime="2025-10-03 08:51:32.983403728 +0000 UTC m=+737.284898058" Oct 03 08:51:37 crc kubenswrapper[4765]: I1003 08:51:37.359959 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bwwmp" Oct 03 08:51:37 crc kubenswrapper[4765]: I1003 08:51:37.680036 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:37 crc kubenswrapper[4765]: I1003 08:51:37.680080 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:37 crc kubenswrapper[4765]: I1003 08:51:37.685480 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:37 crc kubenswrapper[4765]: 
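
The pod_startup_latency_tracker entries above report two figures: podStartE2EDuration (watchObservedRunningTime minus podCreationTimestamp) and podStartSLOduration, which additionally excludes the image-pull window; for pods that pulled nothing (firstStartedPulling at the zero time, as with console-848877996-52ncb) the two are equal. The nmstate-webhook numbers line up with that reading: 4.951795829s end to end minus a pull window of about 2.292265837s (m=+734.427463441 minus m=+732.135197604) gives the reported 2.659529992s. A small sketch of that arithmetic, with the monotonic offsets copied from the log (the subtraction-based reconstruction is an interpretation of the logged fields, not the tracker's code):

    package main

    import "fmt"

    func main() {
        // Monotonic offsets ("m=+..." values) and durations copied from the
        // kubelet log for nmstate-webhook-6cdbc54649-m6bmz.
        const (
            firstStartedPulling = 732.135197604 // seconds since kubelet start
            lastFinishedPulling = 734.427463441
            podStartE2E         = 4.951795829 // watchObservedRunningTime - podCreationTimestamp
        )

        pullWindow := lastFinishedPulling - firstStartedPulling
        sloDuration := podStartE2E - pullWindow // startup latency with image pulls excluded

        fmt.Printf("image pull window:   %.9fs\n", pullWindow)  // ~2.292265837
        fmt.Printf("podStartSLOduration: %.9fs\n", sloDuration) // ~2.659529992
    }
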
I1003 08:51:37.997707 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-848877996-52ncb" Oct 03 08:51:38 crc kubenswrapper[4765]: I1003 08:51:38.093126 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-g8jbc"] Oct 03 08:51:42 crc kubenswrapper[4765]: I1003 08:51:42.932258 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2gzl6"] Oct 03 08:51:42 crc kubenswrapper[4765]: I1003 08:51:42.933407 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" podUID="e41d9b4e-c3ce-4604-a3f8-1e972308f9a7" containerName="controller-manager" containerID="cri-o://41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e" gracePeriod=30 Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.028368 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb"] Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.029066 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" podUID="3b2c5fda-4f45-444f-991b-0afa96721739" containerName="route-controller-manager" containerID="cri-o://d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4" gracePeriod=30 Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.378436 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.401206 4765 util.go:48] "No ready sandbox for pod can be found. 
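
The "Killing container with a grace period ... gracePeriod=30" entries above show the kubelet honoring the pods' termination grace period while the controller-manager and route-controller-manager deployments roll to new replica sets; 30 seconds is the Kubernetes default when a pod spec does not set its own value. A minimal sketch of where that number comes from (hypothetical pod, placeholder image):

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        // 30s matches the default terminationGracePeriodSeconds: on delete the
        // kubelet sends SIGTERM, waits up to this long, then force-kills.
        grace := int64(30)
        pod := corev1.Pod{
            Spec: corev1.PodSpec{
                TerminationGracePeriodSeconds: &grace, // defaulted to 30 if omitted
                Containers: []corev1.Container{
                    {Name: "controller-manager", Image: "example.invalid/controller-manager:latest"}, // placeholder
                },
            },
        }
        fmt.Println(*pod.Spec.TerminationGracePeriodSeconds, "seconds")
    }
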
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.491148 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-config\") pod \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.491233 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2c5fda-4f45-444f-991b-0afa96721739-serving-cert\") pod \"3b2c5fda-4f45-444f-991b-0afa96721739\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.491275 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7psh\" (UniqueName: \"kubernetes.io/projected/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-kube-api-access-z7psh\") pod \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.491313 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-client-ca\") pod \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.491343 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-proxy-ca-bundles\") pod \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.491389 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-client-ca\") pod \"3b2c5fda-4f45-444f-991b-0afa96721739\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.491428 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tr8j\" (UniqueName: \"kubernetes.io/projected/3b2c5fda-4f45-444f-991b-0afa96721739-kube-api-access-7tr8j\") pod \"3b2c5fda-4f45-444f-991b-0afa96721739\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.491464 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-config\") pod \"3b2c5fda-4f45-444f-991b-0afa96721739\" (UID: \"3b2c5fda-4f45-444f-991b-0afa96721739\") " Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.491539 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-serving-cert\") pod \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\" (UID: \"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7\") " Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.492270 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-config" (OuterVolumeSpecName: "config") pod "e41d9b4e-c3ce-4604-a3f8-1e972308f9a7" (UID: 
"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.493480 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-client-ca" (OuterVolumeSpecName: "client-ca") pod "e41d9b4e-c3ce-4604-a3f8-1e972308f9a7" (UID: "e41d9b4e-c3ce-4604-a3f8-1e972308f9a7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.493803 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e41d9b4e-c3ce-4604-a3f8-1e972308f9a7" (UID: "e41d9b4e-c3ce-4604-a3f8-1e972308f9a7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.493842 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-config" (OuterVolumeSpecName: "config") pod "3b2c5fda-4f45-444f-991b-0afa96721739" (UID: "3b2c5fda-4f45-444f-991b-0afa96721739"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.493996 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-client-ca" (OuterVolumeSpecName: "client-ca") pod "3b2c5fda-4f45-444f-991b-0afa96721739" (UID: "3b2c5fda-4f45-444f-991b-0afa96721739"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.498848 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e41d9b4e-c3ce-4604-a3f8-1e972308f9a7" (UID: "e41d9b4e-c3ce-4604-a3f8-1e972308f9a7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.499362 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-kube-api-access-z7psh" (OuterVolumeSpecName: "kube-api-access-z7psh") pod "e41d9b4e-c3ce-4604-a3f8-1e972308f9a7" (UID: "e41d9b4e-c3ce-4604-a3f8-1e972308f9a7"). InnerVolumeSpecName "kube-api-access-z7psh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.502417 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b2c5fda-4f45-444f-991b-0afa96721739-kube-api-access-7tr8j" (OuterVolumeSpecName: "kube-api-access-7tr8j") pod "3b2c5fda-4f45-444f-991b-0afa96721739" (UID: "3b2c5fda-4f45-444f-991b-0afa96721739"). InnerVolumeSpecName "kube-api-access-7tr8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.502706 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b2c5fda-4f45-444f-991b-0afa96721739-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3b2c5fda-4f45-444f-991b-0afa96721739" (UID: "3b2c5fda-4f45-444f-991b-0afa96721739"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.592570 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tr8j\" (UniqueName: \"kubernetes.io/projected/3b2c5fda-4f45-444f-991b-0afa96721739-kube-api-access-7tr8j\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.592612 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.592625 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.592636 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.592658 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b2c5fda-4f45-444f-991b-0afa96721739-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.592693 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7psh\" (UniqueName: \"kubernetes.io/projected/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-kube-api-access-z7psh\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.592702 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.592714 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:43 crc kubenswrapper[4765]: I1003 08:51:43.592723 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b2c5fda-4f45-444f-991b-0afa96721739-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.032033 4765 generic.go:334] "Generic (PLEG): container finished" podID="3b2c5fda-4f45-444f-991b-0afa96721739" containerID="d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4" exitCode=0 Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.032102 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" event={"ID":"3b2c5fda-4f45-444f-991b-0afa96721739","Type":"ContainerDied","Data":"d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4"} Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.032130 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" event={"ID":"3b2c5fda-4f45-444f-991b-0afa96721739","Type":"ContainerDied","Data":"d699286acc7e5a3e670786b6c3628476dea60a398a96fe5b394392f1d115d335"} Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.032149 4765 scope.go:117] "RemoveContainer" 
containerID="d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.032184 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.033789 4765 generic.go:334] "Generic (PLEG): container finished" podID="e41d9b4e-c3ce-4604-a3f8-1e972308f9a7" containerID="41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e" exitCode=0 Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.033826 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" event={"ID":"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7","Type":"ContainerDied","Data":"41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e"} Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.033848 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" event={"ID":"e41d9b4e-c3ce-4604-a3f8-1e972308f9a7","Type":"ContainerDied","Data":"c16e50c5e053e3a8bb68771c0c4c03786f8cba92cfb89ec044499f22d5792dfb"} Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.033849 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2gzl6" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.056586 4765 scope.go:117] "RemoveContainer" containerID="d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4" Oct 03 08:51:44 crc kubenswrapper[4765]: E1003 08:51:44.057123 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4\": container with ID starting with d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4 not found: ID does not exist" containerID="d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.057150 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4"} err="failed to get container status \"d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4\": rpc error: code = NotFound desc = could not find container \"d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4\": container with ID starting with d86a79509e5ffc0e76448c5d7eb1b089cb88afe857861ea1ff255560d8f789f4 not found: ID does not exist" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.057171 4765 scope.go:117] "RemoveContainer" containerID="41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.074659 4765 scope.go:117] "RemoveContainer" containerID="41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e" Oct 03 08:51:44 crc kubenswrapper[4765]: E1003 08:51:44.075611 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e\": container with ID starting with 41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e not found: ID does not exist" containerID="41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e" Oct 03 08:51:44 crc kubenswrapper[4765]: 
I1003 08:51:44.075657 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e"} err="failed to get container status \"41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e\": rpc error: code = NotFound desc = could not find container \"41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e\": container with ID starting with 41abd7bba33bd5738b20482f2f0b1c2605e5902861db39ec8ac96b5eb405557e not found: ID does not exist" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.086788 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb"] Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.096559 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4srhb"] Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.100149 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2gzl6"] Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.110906 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2gzl6"] Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.318115 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b2c5fda-4f45-444f-991b-0afa96721739" path="/var/lib/kubelet/pods/3b2c5fda-4f45-444f-991b-0afa96721739/volumes" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.319436 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e41d9b4e-c3ce-4604-a3f8-1e972308f9a7" path="/var/lib/kubelet/pods/e41d9b4e-c3ce-4604-a3f8-1e972308f9a7/volumes" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.722323 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77dd696b7c-7drnk"] Oct 03 08:51:44 crc kubenswrapper[4765]: E1003 08:51:44.722583 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41d9b4e-c3ce-4604-a3f8-1e972308f9a7" containerName="controller-manager" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.722600 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41d9b4e-c3ce-4604-a3f8-1e972308f9a7" containerName="controller-manager" Oct 03 08:51:44 crc kubenswrapper[4765]: E1003 08:51:44.722617 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b2c5fda-4f45-444f-991b-0afa96721739" containerName="route-controller-manager" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.722625 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b2c5fda-4f45-444f-991b-0afa96721739" containerName="route-controller-manager" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.722781 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41d9b4e-c3ce-4604-a3f8-1e972308f9a7" containerName="controller-manager" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.722808 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b2c5fda-4f45-444f-991b-0afa96721739" containerName="route-controller-manager" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.723331 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.726535 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.726609 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs"] Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.726694 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.727231 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.727732 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.727781 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.727939 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.727942 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.732259 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.737755 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.737783 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.737755 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.737967 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.738126 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.742959 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs"] Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.746141 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.746471 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77dd696b7c-7drnk"] Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.807909 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/36634e16-c726-45ea-b1ff-8cbad86e3cb0-proxy-ca-bundles\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.808007 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f945d245-6ef8-4e0f-bfd2-51265e9a866d-serving-cert\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.808032 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f945d245-6ef8-4e0f-bfd2-51265e9a866d-client-ca\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.808047 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72kcx\" (UniqueName: \"kubernetes.io/projected/f945d245-6ef8-4e0f-bfd2-51265e9a866d-kube-api-access-72kcx\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.808119 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqzrn\" (UniqueName: \"kubernetes.io/projected/36634e16-c726-45ea-b1ff-8cbad86e3cb0-kube-api-access-mqzrn\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.808137 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f945d245-6ef8-4e0f-bfd2-51265e9a866d-config\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.808160 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36634e16-c726-45ea-b1ff-8cbad86e3cb0-config\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.808206 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36634e16-c726-45ea-b1ff-8cbad86e3cb0-serving-cert\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.808225 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/36634e16-c726-45ea-b1ff-8cbad86e3cb0-client-ca\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.909253 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/36634e16-c726-45ea-b1ff-8cbad86e3cb0-proxy-ca-bundles\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.909573 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f945d245-6ef8-4e0f-bfd2-51265e9a866d-serving-cert\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.909697 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f945d245-6ef8-4e0f-bfd2-51265e9a866d-client-ca\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.909803 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72kcx\" (UniqueName: \"kubernetes.io/projected/f945d245-6ef8-4e0f-bfd2-51265e9a866d-kube-api-access-72kcx\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.909893 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f945d245-6ef8-4e0f-bfd2-51265e9a866d-config\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.909969 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqzrn\" (UniqueName: \"kubernetes.io/projected/36634e16-c726-45ea-b1ff-8cbad86e3cb0-kube-api-access-mqzrn\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.910041 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36634e16-c726-45ea-b1ff-8cbad86e3cb0-config\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.910111 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36634e16-c726-45ea-b1ff-8cbad86e3cb0-serving-cert\") pod 
\"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.910191 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/36634e16-c726-45ea-b1ff-8cbad86e3cb0-client-ca\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.910348 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/36634e16-c726-45ea-b1ff-8cbad86e3cb0-proxy-ca-bundles\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.910629 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f945d245-6ef8-4e0f-bfd2-51265e9a866d-client-ca\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.911156 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/36634e16-c726-45ea-b1ff-8cbad86e3cb0-client-ca\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.911190 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f945d245-6ef8-4e0f-bfd2-51265e9a866d-config\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.911550 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36634e16-c726-45ea-b1ff-8cbad86e3cb0-config\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.915597 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f945d245-6ef8-4e0f-bfd2-51265e9a866d-serving-cert\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.923363 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36634e16-c726-45ea-b1ff-8cbad86e3cb0-serving-cert\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.926804 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqzrn\" (UniqueName: \"kubernetes.io/projected/36634e16-c726-45ea-b1ff-8cbad86e3cb0-kube-api-access-mqzrn\") pod \"controller-manager-77dd696b7c-7drnk\" (UID: \"36634e16-c726-45ea-b1ff-8cbad86e3cb0\") " pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:44 crc kubenswrapper[4765]: I1003 08:51:44.927310 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72kcx\" (UniqueName: \"kubernetes.io/projected/f945d245-6ef8-4e0f-bfd2-51265e9a866d-kube-api-access-72kcx\") pod \"route-controller-manager-5dd874986d-mq5qs\" (UID: \"f945d245-6ef8-4e0f-bfd2-51265e9a866d\") " pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:45 crc kubenswrapper[4765]: I1003 08:51:45.047668 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:45 crc kubenswrapper[4765]: I1003 08:51:45.061397 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:45 crc kubenswrapper[4765]: I1003 08:51:45.250370 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77dd696b7c-7drnk"] Oct 03 08:51:45 crc kubenswrapper[4765]: I1003 08:51:45.304762 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs"] Oct 03 08:51:46 crc kubenswrapper[4765]: I1003 08:51:46.051550 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" event={"ID":"36634e16-c726-45ea-b1ff-8cbad86e3cb0","Type":"ContainerStarted","Data":"31153659b19baed8a94d075e5f89ed47afb99cfba3205db470d598c10cba4c93"} Oct 03 08:51:46 crc kubenswrapper[4765]: I1003 08:51:46.052256 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" event={"ID":"36634e16-c726-45ea-b1ff-8cbad86e3cb0","Type":"ContainerStarted","Data":"c954de7ac197e42b77f8861f696ed41ec7460786f974a6fdd38544b110a36deb"} Oct 03 08:51:46 crc kubenswrapper[4765]: I1003 08:51:46.052309 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:46 crc kubenswrapper[4765]: I1003 08:51:46.053509 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" event={"ID":"f945d245-6ef8-4e0f-bfd2-51265e9a866d","Type":"ContainerStarted","Data":"11b39c927d27aa2d289fc57747a952a579b374b3fff825fa8aae313d1011b4c1"} Oct 03 08:51:46 crc kubenswrapper[4765]: I1003 08:51:46.053591 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" event={"ID":"f945d245-6ef8-4e0f-bfd2-51265e9a866d","Type":"ContainerStarted","Data":"ab664e24243a13a1082e6855354a8fd4cb251c14fa8c6b4449e80022f52bac29"} Oct 03 08:51:46 crc kubenswrapper[4765]: I1003 08:51:46.054311 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:46 crc kubenswrapper[4765]: I1003 08:51:46.058413 4765 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" Oct 03 08:51:46 crc kubenswrapper[4765]: I1003 08:51:46.060860 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" Oct 03 08:51:46 crc kubenswrapper[4765]: I1003 08:51:46.104099 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77dd696b7c-7drnk" podStartSLOduration=3.104079243 podStartE2EDuration="3.104079243s" podCreationTimestamp="2025-10-03 08:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:51:46.086485465 +0000 UTC m=+750.387979795" watchObservedRunningTime="2025-10-03 08:51:46.104079243 +0000 UTC m=+750.405573573" Oct 03 08:51:46 crc kubenswrapper[4765]: I1003 08:51:46.105853 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5dd874986d-mq5qs" podStartSLOduration=3.105848578 podStartE2EDuration="3.105848578s" podCreationTimestamp="2025-10-03 08:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:51:46.102713258 +0000 UTC m=+750.404207588" watchObservedRunningTime="2025-10-03 08:51:46.105848578 +0000 UTC m=+750.407342908" Oct 03 08:51:47 crc kubenswrapper[4765]: I1003 08:51:47.301901 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-m6bmz" Oct 03 08:51:52 crc kubenswrapper[4765]: I1003 08:51:52.138840 4765 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 08:52:00 crc kubenswrapper[4765]: I1003 08:52:00.679688 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:52:00 crc kubenswrapper[4765]: I1003 08:52:00.680227 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.070896 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj"] Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.072216 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.074271 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.080686 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj"] Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.264332 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzppd\" (UniqueName: \"kubernetes.io/projected/aca861ef-e249-4395-8760-c2b556b47ae7-kube-api-access-vzppd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.264413 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.264483 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.366816 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzppd\" (UniqueName: \"kubernetes.io/projected/aca861ef-e249-4395-8760-c2b556b47ae7-kube-api-access-vzppd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.366903 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.366957 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.367421 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.367912 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.389239 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzppd\" (UniqueName: \"kubernetes.io/projected/aca861ef-e249-4395-8760-c2b556b47ae7-kube-api-access-vzppd\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.419940 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:01 crc kubenswrapper[4765]: I1003 08:52:01.830393 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj"] Oct 03 08:52:01 crc kubenswrapper[4765]: W1003 08:52:01.844837 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaca861ef_e249_4395_8760_c2b556b47ae7.slice/crio-132daf9e7167cbdf4cad6f266e1033d48461fe24a5d5c233978a7aa6ff94e1c5 WatchSource:0}: Error finding container 132daf9e7167cbdf4cad6f266e1033d48461fe24a5d5c233978a7aa6ff94e1c5: Status 404 returned error can't find the container with id 132daf9e7167cbdf4cad6f266e1033d48461fe24a5d5c233978a7aa6ff94e1c5 Oct 03 08:52:02 crc kubenswrapper[4765]: I1003 08:52:02.141537 4765 generic.go:334] "Generic (PLEG): container finished" podID="aca861ef-e249-4395-8760-c2b556b47ae7" containerID="22db36809e0f392262811712613e8ae97bb4f8cc250fdd2d5966b65c2cb528cd" exitCode=0 Oct 03 08:52:02 crc kubenswrapper[4765]: I1003 08:52:02.141591 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" event={"ID":"aca861ef-e249-4395-8760-c2b556b47ae7","Type":"ContainerDied","Data":"22db36809e0f392262811712613e8ae97bb4f8cc250fdd2d5966b65c2cb528cd"} Oct 03 08:52:02 crc kubenswrapper[4765]: I1003 08:52:02.141619 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" event={"ID":"aca861ef-e249-4395-8760-c2b556b47ae7","Type":"ContainerStarted","Data":"132daf9e7167cbdf4cad6f266e1033d48461fe24a5d5c233978a7aa6ff94e1c5"} Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.167237 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-g8jbc" podUID="d6e8ca49-1faf-4e22-8760-d7eca3820980" containerName="console" containerID="cri-o://5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9" gracePeriod=15 Oct 03 08:52:03 crc 
kubenswrapper[4765]: I1003 08:52:03.618238 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-g8jbc_d6e8ca49-1faf-4e22-8760-d7eca3820980/console/0.log" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.618562 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.699204 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-config\") pod \"d6e8ca49-1faf-4e22-8760-d7eca3820980\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.699599 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-serving-cert\") pod \"d6e8ca49-1faf-4e22-8760-d7eca3820980\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.699658 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-trusted-ca-bundle\") pod \"d6e8ca49-1faf-4e22-8760-d7eca3820980\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.699692 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-oauth-config\") pod \"d6e8ca49-1faf-4e22-8760-d7eca3820980\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.699722 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-service-ca\") pod \"d6e8ca49-1faf-4e22-8760-d7eca3820980\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.699747 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-oauth-serving-cert\") pod \"d6e8ca49-1faf-4e22-8760-d7eca3820980\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.699779 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5rgb\" (UniqueName: \"kubernetes.io/projected/d6e8ca49-1faf-4e22-8760-d7eca3820980-kube-api-access-f5rgb\") pod \"d6e8ca49-1faf-4e22-8760-d7eca3820980\" (UID: \"d6e8ca49-1faf-4e22-8760-d7eca3820980\") " Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.700171 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-config" (OuterVolumeSpecName: "console-config") pod "d6e8ca49-1faf-4e22-8760-d7eca3820980" (UID: "d6e8ca49-1faf-4e22-8760-d7eca3820980"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.700180 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d6e8ca49-1faf-4e22-8760-d7eca3820980" (UID: "d6e8ca49-1faf-4e22-8760-d7eca3820980"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.700521 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d6e8ca49-1faf-4e22-8760-d7eca3820980" (UID: "d6e8ca49-1faf-4e22-8760-d7eca3820980"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.700677 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-service-ca" (OuterVolumeSpecName: "service-ca") pod "d6e8ca49-1faf-4e22-8760-d7eca3820980" (UID: "d6e8ca49-1faf-4e22-8760-d7eca3820980"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.705322 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d6e8ca49-1faf-4e22-8760-d7eca3820980" (UID: "d6e8ca49-1faf-4e22-8760-d7eca3820980"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.705521 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d6e8ca49-1faf-4e22-8760-d7eca3820980" (UID: "d6e8ca49-1faf-4e22-8760-d7eca3820980"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.706443 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e8ca49-1faf-4e22-8760-d7eca3820980-kube-api-access-f5rgb" (OuterVolumeSpecName: "kube-api-access-f5rgb") pod "d6e8ca49-1faf-4e22-8760-d7eca3820980" (UID: "d6e8ca49-1faf-4e22-8760-d7eca3820980"). InnerVolumeSpecName "kube-api-access-f5rgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.800358 4765 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.800440 4765 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.800452 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.800462 4765 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6e8ca49-1faf-4e22-8760-d7eca3820980-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.800471 4765 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.800479 4765 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6e8ca49-1faf-4e22-8760-d7eca3820980-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:03 crc kubenswrapper[4765]: I1003 08:52:03.800490 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5rgb\" (UniqueName: \"kubernetes.io/projected/d6e8ca49-1faf-4e22-8760-d7eca3820980-kube-api-access-f5rgb\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.156534 4765 generic.go:334] "Generic (PLEG): container finished" podID="aca861ef-e249-4395-8760-c2b556b47ae7" containerID="8a6e1c9f96c5ec7f9f246035bd9f40504ece7238bb215b12533abed023df02a3" exitCode=0 Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.156624 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" event={"ID":"aca861ef-e249-4395-8760-c2b556b47ae7","Type":"ContainerDied","Data":"8a6e1c9f96c5ec7f9f246035bd9f40504ece7238bb215b12533abed023df02a3"} Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.158410 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-g8jbc_d6e8ca49-1faf-4e22-8760-d7eca3820980/console/0.log" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.158460 4765 generic.go:334] "Generic (PLEG): container finished" podID="d6e8ca49-1faf-4e22-8760-d7eca3820980" containerID="5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9" exitCode=2 Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.158492 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g8jbc" event={"ID":"d6e8ca49-1faf-4e22-8760-d7eca3820980","Type":"ContainerDied","Data":"5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9"} Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.158516 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g8jbc" 
event={"ID":"d6e8ca49-1faf-4e22-8760-d7eca3820980","Type":"ContainerDied","Data":"02edebc121a2539e6a682e8abe0e16ae0068222cf3ff2034cda2deb460533dff"} Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.158542 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g8jbc" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.158549 4765 scope.go:117] "RemoveContainer" containerID="5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.181255 4765 scope.go:117] "RemoveContainer" containerID="5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9" Oct 03 08:52:04 crc kubenswrapper[4765]: E1003 08:52:04.181807 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9\": container with ID starting with 5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9 not found: ID does not exist" containerID="5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.181846 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9"} err="failed to get container status \"5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9\": rpc error: code = NotFound desc = could not find container \"5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9\": container with ID starting with 5eac0a0fb9aafe9e42d534b96384e590fa0025a066eb96cb750e9e0f08c39bd9 not found: ID does not exist" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.196072 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-g8jbc"] Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.199426 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-g8jbc"] Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.315779 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e8ca49-1faf-4e22-8760-d7eca3820980" path="/var/lib/kubelet/pods/d6e8ca49-1faf-4e22-8760-d7eca3820980/volumes" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.422841 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cdz52"] Oct 03 08:52:04 crc kubenswrapper[4765]: E1003 08:52:04.423065 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e8ca49-1faf-4e22-8760-d7eca3820980" containerName="console" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.423075 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e8ca49-1faf-4e22-8760-d7eca3820980" containerName="console" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.423190 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e8ca49-1faf-4e22-8760-d7eca3820980" containerName="console" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.423970 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.441539 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cdz52"] Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.507793 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-catalog-content\") pod \"redhat-operators-cdz52\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.507860 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4jks\" (UniqueName: \"kubernetes.io/projected/c4252c83-3dc7-420a-aa87-63e1eccad487-kube-api-access-s4jks\") pod \"redhat-operators-cdz52\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.507884 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-utilities\") pod \"redhat-operators-cdz52\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.608744 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-catalog-content\") pod \"redhat-operators-cdz52\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.608812 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4jks\" (UniqueName: \"kubernetes.io/projected/c4252c83-3dc7-420a-aa87-63e1eccad487-kube-api-access-s4jks\") pod \"redhat-operators-cdz52\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.608837 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-utilities\") pod \"redhat-operators-cdz52\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.609290 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-utilities\") pod \"redhat-operators-cdz52\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.609407 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-catalog-content\") pod \"redhat-operators-cdz52\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.627421 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s4jks\" (UniqueName: \"kubernetes.io/projected/c4252c83-3dc7-420a-aa87-63e1eccad487-kube-api-access-s4jks\") pod \"redhat-operators-cdz52\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:04 crc kubenswrapper[4765]: I1003 08:52:04.738921 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:05 crc kubenswrapper[4765]: I1003 08:52:05.163396 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cdz52"] Oct 03 08:52:05 crc kubenswrapper[4765]: I1003 08:52:05.166452 4765 generic.go:334] "Generic (PLEG): container finished" podID="aca861ef-e249-4395-8760-c2b556b47ae7" containerID="0e88cf4bb074adcd846bd51d8f80ed9ce8d424cf155dd84c34269fd4e233d846" exitCode=0 Oct 03 08:52:05 crc kubenswrapper[4765]: I1003 08:52:05.166498 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" event={"ID":"aca861ef-e249-4395-8760-c2b556b47ae7","Type":"ContainerDied","Data":"0e88cf4bb074adcd846bd51d8f80ed9ce8d424cf155dd84c34269fd4e233d846"} Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.175614 4765 generic.go:334] "Generic (PLEG): container finished" podID="c4252c83-3dc7-420a-aa87-63e1eccad487" containerID="2994ed8dc1e7a5fd7cd001aef31f4b1c80b57d4bbfb2f08b7ec6c30fe4cfc1bf" exitCode=0 Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.175737 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz52" event={"ID":"c4252c83-3dc7-420a-aa87-63e1eccad487","Type":"ContainerDied","Data":"2994ed8dc1e7a5fd7cd001aef31f4b1c80b57d4bbfb2f08b7ec6c30fe4cfc1bf"} Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.175789 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz52" event={"ID":"c4252c83-3dc7-420a-aa87-63e1eccad487","Type":"ContainerStarted","Data":"109ee9b3be2bd013be44dce9c4b99ec1b2fe01a5023a510df185f44504e868ab"} Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.507973 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.649580 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzppd\" (UniqueName: \"kubernetes.io/projected/aca861ef-e249-4395-8760-c2b556b47ae7-kube-api-access-vzppd\") pod \"aca861ef-e249-4395-8760-c2b556b47ae7\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.649704 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-util\") pod \"aca861ef-e249-4395-8760-c2b556b47ae7\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.649775 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-bundle\") pod \"aca861ef-e249-4395-8760-c2b556b47ae7\" (UID: \"aca861ef-e249-4395-8760-c2b556b47ae7\") " Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.651007 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-bundle" (OuterVolumeSpecName: "bundle") pod "aca861ef-e249-4395-8760-c2b556b47ae7" (UID: "aca861ef-e249-4395-8760-c2b556b47ae7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.656407 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca861ef-e249-4395-8760-c2b556b47ae7-kube-api-access-vzppd" (OuterVolumeSpecName: "kube-api-access-vzppd") pod "aca861ef-e249-4395-8760-c2b556b47ae7" (UID: "aca861ef-e249-4395-8760-c2b556b47ae7"). InnerVolumeSpecName "kube-api-access-vzppd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.664633 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-util" (OuterVolumeSpecName: "util") pod "aca861ef-e249-4395-8760-c2b556b47ae7" (UID: "aca861ef-e249-4395-8760-c2b556b47ae7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.750922 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzppd\" (UniqueName: \"kubernetes.io/projected/aca861ef-e249-4395-8760-c2b556b47ae7-kube-api-access-vzppd\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.750954 4765 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:06 crc kubenswrapper[4765]: I1003 08:52:06.750965 4765 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aca861ef-e249-4395-8760-c2b556b47ae7-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:07 crc kubenswrapper[4765]: I1003 08:52:07.184256 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" event={"ID":"aca861ef-e249-4395-8760-c2b556b47ae7","Type":"ContainerDied","Data":"132daf9e7167cbdf4cad6f266e1033d48461fe24a5d5c233978a7aa6ff94e1c5"} Oct 03 08:52:07 crc kubenswrapper[4765]: I1003 08:52:07.184601 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="132daf9e7167cbdf4cad6f266e1033d48461fe24a5d5c233978a7aa6ff94e1c5" Oct 03 08:52:07 crc kubenswrapper[4765]: I1003 08:52:07.184357 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj" Oct 03 08:52:07 crc kubenswrapper[4765]: I1003 08:52:07.186415 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz52" event={"ID":"c4252c83-3dc7-420a-aa87-63e1eccad487","Type":"ContainerStarted","Data":"052d57ed21c1723fb7fe81b16621a0a8759bd5ecd999749e84d26e96e4f4ff80"} Oct 03 08:52:08 crc kubenswrapper[4765]: I1003 08:52:08.194027 4765 generic.go:334] "Generic (PLEG): container finished" podID="c4252c83-3dc7-420a-aa87-63e1eccad487" containerID="052d57ed21c1723fb7fe81b16621a0a8759bd5ecd999749e84d26e96e4f4ff80" exitCode=0 Oct 03 08:52:08 crc kubenswrapper[4765]: I1003 08:52:08.194122 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz52" event={"ID":"c4252c83-3dc7-420a-aa87-63e1eccad487","Type":"ContainerDied","Data":"052d57ed21c1723fb7fe81b16621a0a8759bd5ecd999749e84d26e96e4f4ff80"} Oct 03 08:52:09 crc kubenswrapper[4765]: I1003 08:52:09.219207 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz52" event={"ID":"c4252c83-3dc7-420a-aa87-63e1eccad487","Type":"ContainerStarted","Data":"cbd7cc76026446dd74ab6da88d0c3e64bae9fa06e134787a5d54eb50f72c71bf"} Oct 03 08:52:09 crc kubenswrapper[4765]: I1003 08:52:09.249535 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cdz52" podStartSLOduration=2.812972544 podStartE2EDuration="5.249514304s" podCreationTimestamp="2025-10-03 08:52:04 +0000 UTC" firstStartedPulling="2025-10-03 08:52:06.177301108 +0000 UTC m=+770.478795438" lastFinishedPulling="2025-10-03 08:52:08.613842848 +0000 UTC m=+772.915337198" observedRunningTime="2025-10-03 08:52:09.243797878 +0000 UTC m=+773.545292218" watchObservedRunningTime="2025-10-03 08:52:09.249514304 +0000 UTC m=+773.551008654" Oct 03 08:52:14 crc kubenswrapper[4765]: I1003 
08:52:14.739117 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:14 crc kubenswrapper[4765]: I1003 08:52:14.739370 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:14 crc kubenswrapper[4765]: I1003 08:52:14.779870 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:15 crc kubenswrapper[4765]: I1003 08:52:15.299580 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:18 crc kubenswrapper[4765]: I1003 08:52:18.412457 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cdz52"] Oct 03 08:52:18 crc kubenswrapper[4765]: I1003 08:52:18.413433 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cdz52" podUID="c4252c83-3dc7-420a-aa87-63e1eccad487" containerName="registry-server" containerID="cri-o://cbd7cc76026446dd74ab6da88d0c3e64bae9fa06e134787a5d54eb50f72c71bf" gracePeriod=2 Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.277931 4765 generic.go:334] "Generic (PLEG): container finished" podID="c4252c83-3dc7-420a-aa87-63e1eccad487" containerID="cbd7cc76026446dd74ab6da88d0c3e64bae9fa06e134787a5d54eb50f72c71bf" exitCode=0 Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.277978 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz52" event={"ID":"c4252c83-3dc7-420a-aa87-63e1eccad487","Type":"ContainerDied","Data":"cbd7cc76026446dd74ab6da88d0c3e64bae9fa06e134787a5d54eb50f72c71bf"} Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.459546 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.612889 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-catalog-content\") pod \"c4252c83-3dc7-420a-aa87-63e1eccad487\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.612946 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-utilities\") pod \"c4252c83-3dc7-420a-aa87-63e1eccad487\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.612991 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4jks\" (UniqueName: \"kubernetes.io/projected/c4252c83-3dc7-420a-aa87-63e1eccad487-kube-api-access-s4jks\") pod \"c4252c83-3dc7-420a-aa87-63e1eccad487\" (UID: \"c4252c83-3dc7-420a-aa87-63e1eccad487\") " Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.614924 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-utilities" (OuterVolumeSpecName: "utilities") pod "c4252c83-3dc7-420a-aa87-63e1eccad487" (UID: "c4252c83-3dc7-420a-aa87-63e1eccad487"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.627235 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4252c83-3dc7-420a-aa87-63e1eccad487-kube-api-access-s4jks" (OuterVolumeSpecName: "kube-api-access-s4jks") pod "c4252c83-3dc7-420a-aa87-63e1eccad487" (UID: "c4252c83-3dc7-420a-aa87-63e1eccad487"). InnerVolumeSpecName "kube-api-access-s4jks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.691734 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4252c83-3dc7-420a-aa87-63e1eccad487" (UID: "c4252c83-3dc7-420a-aa87-63e1eccad487"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.714024 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.714077 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4252c83-3dc7-420a-aa87-63e1eccad487-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:19 crc kubenswrapper[4765]: I1003 08:52:19.714088 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4jks\" (UniqueName: \"kubernetes.io/projected/c4252c83-3dc7-420a-aa87-63e1eccad487-kube-api-access-s4jks\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.285023 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdz52" event={"ID":"c4252c83-3dc7-420a-aa87-63e1eccad487","Type":"ContainerDied","Data":"109ee9b3be2bd013be44dce9c4b99ec1b2fe01a5023a510df185f44504e868ab"} Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.285078 4765 scope.go:117] "RemoveContainer" containerID="cbd7cc76026446dd74ab6da88d0c3e64bae9fa06e134787a5d54eb50f72c71bf" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.285154 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cdz52" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.299319 4765 scope.go:117] "RemoveContainer" containerID="052d57ed21c1723fb7fe81b16621a0a8759bd5ecd999749e84d26e96e4f4ff80" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.315735 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cdz52"] Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.317588 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cdz52"] Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.322403 4765 scope.go:117] "RemoveContainer" containerID="2994ed8dc1e7a5fd7cd001aef31f4b1c80b57d4bbfb2f08b7ec6c30fe4cfc1bf" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.749018 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-778d9f7978-ns677"] Oct 03 08:52:20 crc kubenswrapper[4765]: E1003 08:52:20.749555 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4252c83-3dc7-420a-aa87-63e1eccad487" containerName="extract-utilities" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.749571 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4252c83-3dc7-420a-aa87-63e1eccad487" containerName="extract-utilities" Oct 03 08:52:20 crc kubenswrapper[4765]: E1003 08:52:20.749578 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4252c83-3dc7-420a-aa87-63e1eccad487" containerName="extract-content" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.749584 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4252c83-3dc7-420a-aa87-63e1eccad487" containerName="extract-content" Oct 03 08:52:20 crc kubenswrapper[4765]: E1003 08:52:20.749594 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca861ef-e249-4395-8760-c2b556b47ae7" containerName="util" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.749601 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca861ef-e249-4395-8760-c2b556b47ae7" containerName="util" Oct 03 08:52:20 crc kubenswrapper[4765]: E1003 08:52:20.749621 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4252c83-3dc7-420a-aa87-63e1eccad487" containerName="registry-server" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.749627 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4252c83-3dc7-420a-aa87-63e1eccad487" containerName="registry-server" Oct 03 08:52:20 crc kubenswrapper[4765]: E1003 08:52:20.749635 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca861ef-e249-4395-8760-c2b556b47ae7" containerName="extract" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.749659 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca861ef-e249-4395-8760-c2b556b47ae7" containerName="extract" Oct 03 08:52:20 crc kubenswrapper[4765]: E1003 08:52:20.749669 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca861ef-e249-4395-8760-c2b556b47ae7" containerName="pull" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.749677 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca861ef-e249-4395-8760-c2b556b47ae7" containerName="pull" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.749791 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca861ef-e249-4395-8760-c2b556b47ae7" containerName="extract" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.749801 4765 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c4252c83-3dc7-420a-aa87-63e1eccad487" containerName="registry-server" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.750314 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.753078 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.753178 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.753438 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.753667 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.755487 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-kclsw" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.786113 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-778d9f7978-ns677"] Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.829232 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5247665-eb4f-47b9-9400-71a8a43d381c-webhook-cert\") pod \"metallb-operator-controller-manager-778d9f7978-ns677\" (UID: \"f5247665-eb4f-47b9-9400-71a8a43d381c\") " pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.829304 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ppx2\" (UniqueName: \"kubernetes.io/projected/f5247665-eb4f-47b9-9400-71a8a43d381c-kube-api-access-8ppx2\") pod \"metallb-operator-controller-manager-778d9f7978-ns677\" (UID: \"f5247665-eb4f-47b9-9400-71a8a43d381c\") " pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.829347 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5247665-eb4f-47b9-9400-71a8a43d381c-apiservice-cert\") pod \"metallb-operator-controller-manager-778d9f7978-ns677\" (UID: \"f5247665-eb4f-47b9-9400-71a8a43d381c\") " pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.929843 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5247665-eb4f-47b9-9400-71a8a43d381c-apiservice-cert\") pod \"metallb-operator-controller-manager-778d9f7978-ns677\" (UID: \"f5247665-eb4f-47b9-9400-71a8a43d381c\") " pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.929917 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5247665-eb4f-47b9-9400-71a8a43d381c-webhook-cert\") pod 
\"metallb-operator-controller-manager-778d9f7978-ns677\" (UID: \"f5247665-eb4f-47b9-9400-71a8a43d381c\") " pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.929949 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ppx2\" (UniqueName: \"kubernetes.io/projected/f5247665-eb4f-47b9-9400-71a8a43d381c-kube-api-access-8ppx2\") pod \"metallb-operator-controller-manager-778d9f7978-ns677\" (UID: \"f5247665-eb4f-47b9-9400-71a8a43d381c\") " pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.936479 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5247665-eb4f-47b9-9400-71a8a43d381c-apiservice-cert\") pod \"metallb-operator-controller-manager-778d9f7978-ns677\" (UID: \"f5247665-eb4f-47b9-9400-71a8a43d381c\") " pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.943683 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5247665-eb4f-47b9-9400-71a8a43d381c-webhook-cert\") pod \"metallb-operator-controller-manager-778d9f7978-ns677\" (UID: \"f5247665-eb4f-47b9-9400-71a8a43d381c\") " pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:20 crc kubenswrapper[4765]: I1003 08:52:20.954775 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ppx2\" (UniqueName: \"kubernetes.io/projected/f5247665-eb4f-47b9-9400-71a8a43d381c-kube-api-access-8ppx2\") pod \"metallb-operator-controller-manager-778d9f7978-ns677\" (UID: \"f5247665-eb4f-47b9-9400-71a8a43d381c\") " pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.064274 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.193915 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6"] Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.194961 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.199905 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-fv7tp" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.200155 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.200301 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.222755 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6"] Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.338665 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4884351b-d735-4f5a-904a-93df33b47d3f-apiservice-cert\") pod \"metallb-operator-webhook-server-77bffb8bb6-z6dx6\" (UID: \"4884351b-d735-4f5a-904a-93df33b47d3f\") " pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.338772 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9d57\" (UniqueName: \"kubernetes.io/projected/4884351b-d735-4f5a-904a-93df33b47d3f-kube-api-access-n9d57\") pod \"metallb-operator-webhook-server-77bffb8bb6-z6dx6\" (UID: \"4884351b-d735-4f5a-904a-93df33b47d3f\") " pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.338801 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4884351b-d735-4f5a-904a-93df33b47d3f-webhook-cert\") pod \"metallb-operator-webhook-server-77bffb8bb6-z6dx6\" (UID: \"4884351b-d735-4f5a-904a-93df33b47d3f\") " pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.440402 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9d57\" (UniqueName: \"kubernetes.io/projected/4884351b-d735-4f5a-904a-93df33b47d3f-kube-api-access-n9d57\") pod \"metallb-operator-webhook-server-77bffb8bb6-z6dx6\" (UID: \"4884351b-d735-4f5a-904a-93df33b47d3f\") " pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.440746 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4884351b-d735-4f5a-904a-93df33b47d3f-webhook-cert\") pod \"metallb-operator-webhook-server-77bffb8bb6-z6dx6\" (UID: \"4884351b-d735-4f5a-904a-93df33b47d3f\") " pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.440807 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4884351b-d735-4f5a-904a-93df33b47d3f-apiservice-cert\") pod \"metallb-operator-webhook-server-77bffb8bb6-z6dx6\" (UID: \"4884351b-d735-4f5a-904a-93df33b47d3f\") " pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 
08:52:21.451753 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4884351b-d735-4f5a-904a-93df33b47d3f-apiservice-cert\") pod \"metallb-operator-webhook-server-77bffb8bb6-z6dx6\" (UID: \"4884351b-d735-4f5a-904a-93df33b47d3f\") " pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.453244 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4884351b-d735-4f5a-904a-93df33b47d3f-webhook-cert\") pod \"metallb-operator-webhook-server-77bffb8bb6-z6dx6\" (UID: \"4884351b-d735-4f5a-904a-93df33b47d3f\") " pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.483584 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9d57\" (UniqueName: \"kubernetes.io/projected/4884351b-d735-4f5a-904a-93df33b47d3f-kube-api-access-n9d57\") pod \"metallb-operator-webhook-server-77bffb8bb6-z6dx6\" (UID: \"4884351b-d735-4f5a-904a-93df33b47d3f\") " pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.544592 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:21 crc kubenswrapper[4765]: I1003 08:52:21.564529 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-778d9f7978-ns677"] Oct 03 08:52:21 crc kubenswrapper[4765]: W1003 08:52:21.587211 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5247665_eb4f_47b9_9400_71a8a43d381c.slice/crio-67988e4ca3c977975ceb60658af893b587f377508728bb680d9d41e07341055b WatchSource:0}: Error finding container 67988e4ca3c977975ceb60658af893b587f377508728bb680d9d41e07341055b: Status 404 returned error can't find the container with id 67988e4ca3c977975ceb60658af893b587f377508728bb680d9d41e07341055b Oct 03 08:52:22 crc kubenswrapper[4765]: I1003 08:52:22.026152 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6"] Oct 03 08:52:22 crc kubenswrapper[4765]: I1003 08:52:22.322046 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4252c83-3dc7-420a-aa87-63e1eccad487" path="/var/lib/kubelet/pods/c4252c83-3dc7-420a-aa87-63e1eccad487/volumes" Oct 03 08:52:22 crc kubenswrapper[4765]: I1003 08:52:22.323092 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" event={"ID":"f5247665-eb4f-47b9-9400-71a8a43d381c","Type":"ContainerStarted","Data":"67988e4ca3c977975ceb60658af893b587f377508728bb680d9d41e07341055b"} Oct 03 08:52:22 crc kubenswrapper[4765]: I1003 08:52:22.323133 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" event={"ID":"4884351b-d735-4f5a-904a-93df33b47d3f","Type":"ContainerStarted","Data":"be560475216db0cf2986ce1846175b01f08a55048d68820450f5cfe38df8e8ec"} Oct 03 08:52:25 crc kubenswrapper[4765]: I1003 08:52:25.342927 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" 
event={"ID":"f5247665-eb4f-47b9-9400-71a8a43d381c","Type":"ContainerStarted","Data":"61351643696d72cebc2e47f9730324fb8b3d371207c32f2c6a37b2197903904e"} Oct 03 08:52:25 crc kubenswrapper[4765]: I1003 08:52:25.344539 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:52:25 crc kubenswrapper[4765]: I1003 08:52:25.375819 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" podStartSLOduration=2.707560121 podStartE2EDuration="5.375797661s" podCreationTimestamp="2025-10-03 08:52:20 +0000 UTC" firstStartedPulling="2025-10-03 08:52:21.595779232 +0000 UTC m=+785.897273562" lastFinishedPulling="2025-10-03 08:52:24.264016772 +0000 UTC m=+788.565511102" observedRunningTime="2025-10-03 08:52:25.371495802 +0000 UTC m=+789.672990142" watchObservedRunningTime="2025-10-03 08:52:25.375797661 +0000 UTC m=+789.677291991" Oct 03 08:52:27 crc kubenswrapper[4765]: I1003 08:52:27.362129 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" event={"ID":"4884351b-d735-4f5a-904a-93df33b47d3f","Type":"ContainerStarted","Data":"5f26f7dfdd64a2cd9295cada1f7961cbc104b75c2d5a1ed3b97d8531959cac65"} Oct 03 08:52:27 crc kubenswrapper[4765]: I1003 08:52:27.384404 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" podStartSLOduration=1.408897555 podStartE2EDuration="6.384387615s" podCreationTimestamp="2025-10-03 08:52:21 +0000 UTC" firstStartedPulling="2025-10-03 08:52:22.04070384 +0000 UTC m=+786.342198170" lastFinishedPulling="2025-10-03 08:52:27.0161939 +0000 UTC m=+791.317688230" observedRunningTime="2025-10-03 08:52:27.380598268 +0000 UTC m=+791.682092598" watchObservedRunningTime="2025-10-03 08:52:27.384387615 +0000 UTC m=+791.685881935" Oct 03 08:52:28 crc kubenswrapper[4765]: I1003 08:52:28.367544 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:30 crc kubenswrapper[4765]: I1003 08:52:30.680681 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:52:30 crc kubenswrapper[4765]: I1003 08:52:30.681042 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:52:30 crc kubenswrapper[4765]: I1003 08:52:30.681094 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:52:30 crc kubenswrapper[4765]: I1003 08:52:30.681773 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5477c0c212e204859773f5f32ccf8d9a259a11b347c7a6534e70a233d47641c8"} pod="openshift-machine-config-operator/machine-config-daemon-j8mss" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Oct 03 08:52:30 crc kubenswrapper[4765]: I1003 08:52:30.681839 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" containerID="cri-o://5477c0c212e204859773f5f32ccf8d9a259a11b347c7a6534e70a233d47641c8" gracePeriod=600 Oct 03 08:52:31 crc kubenswrapper[4765]: I1003 08:52:31.386000 4765 generic.go:334] "Generic (PLEG): container finished" podID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerID="5477c0c212e204859773f5f32ccf8d9a259a11b347c7a6534e70a233d47641c8" exitCode=0 Oct 03 08:52:31 crc kubenswrapper[4765]: I1003 08:52:31.386073 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerDied","Data":"5477c0c212e204859773f5f32ccf8d9a259a11b347c7a6534e70a233d47641c8"} Oct 03 08:52:31 crc kubenswrapper[4765]: I1003 08:52:31.386331 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"bfe51b01984985879807e07e7a2482b4ea6735b787f2d94829df4202c6f13dc1"} Oct 03 08:52:31 crc kubenswrapper[4765]: I1003 08:52:31.386361 4765 scope.go:117] "RemoveContainer" containerID="089a5d4e72b0c207517b7963f9dfafd9affe2b20c69fc8f9bbf6c0c97d24c65d" Oct 03 08:52:41 crc kubenswrapper[4765]: I1003 08:52:41.548804 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-77bffb8bb6-z6dx6" Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.747573 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2859z"] Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.749420 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.757573 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2859z"] Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.822381 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-catalog-content\") pod \"community-operators-2859z\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.822763 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5tfm\" (UniqueName: \"kubernetes.io/projected/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-kube-api-access-m5tfm\") pod \"community-operators-2859z\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.822800 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-utilities\") pod \"community-operators-2859z\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.924017 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-catalog-content\") pod \"community-operators-2859z\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.924072 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5tfm\" (UniqueName: \"kubernetes.io/projected/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-kube-api-access-m5tfm\") pod \"community-operators-2859z\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.924128 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-utilities\") pod \"community-operators-2859z\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.924554 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-catalog-content\") pod \"community-operators-2859z\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.924905 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-utilities\") pod \"community-operators-2859z\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:48 crc kubenswrapper[4765]: I1003 08:52:48.952930 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-m5tfm\" (UniqueName: \"kubernetes.io/projected/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-kube-api-access-m5tfm\") pod \"community-operators-2859z\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:49 crc kubenswrapper[4765]: I1003 08:52:49.070776 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:49 crc kubenswrapper[4765]: I1003 08:52:49.560999 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2859z"] Oct 03 08:52:50 crc kubenswrapper[4765]: I1003 08:52:50.503931 4765 generic.go:334] "Generic (PLEG): container finished" podID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" containerID="ee16253950b711a1d51ddff84de5237b1d016f1201475ab7d8c1a816f306a32c" exitCode=0 Oct 03 08:52:50 crc kubenswrapper[4765]: I1003 08:52:50.503989 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2859z" event={"ID":"f8e23dd0-fc37-42ec-ad04-5fc279d1e020","Type":"ContainerDied","Data":"ee16253950b711a1d51ddff84de5237b1d016f1201475ab7d8c1a816f306a32c"} Oct 03 08:52:50 crc kubenswrapper[4765]: I1003 08:52:50.504231 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2859z" event={"ID":"f8e23dd0-fc37-42ec-ad04-5fc279d1e020","Type":"ContainerStarted","Data":"6fc49fde1ab0f894fa50deff7d8a27a227a0ae5d13f7a14550c95178718437ff"} Oct 03 08:52:51 crc kubenswrapper[4765]: I1003 08:52:51.510857 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2859z" event={"ID":"f8e23dd0-fc37-42ec-ad04-5fc279d1e020","Type":"ContainerStarted","Data":"19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216"} Oct 03 08:52:52 crc kubenswrapper[4765]: I1003 08:52:52.517474 4765 generic.go:334] "Generic (PLEG): container finished" podID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" containerID="19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216" exitCode=0 Oct 03 08:52:52 crc kubenswrapper[4765]: I1003 08:52:52.517545 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2859z" event={"ID":"f8e23dd0-fc37-42ec-ad04-5fc279d1e020","Type":"ContainerDied","Data":"19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216"} Oct 03 08:52:53 crc kubenswrapper[4765]: I1003 08:52:53.525551 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2859z" event={"ID":"f8e23dd0-fc37-42ec-ad04-5fc279d1e020","Type":"ContainerStarted","Data":"061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d"} Oct 03 08:52:55 crc kubenswrapper[4765]: I1003 08:52:55.944963 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2859z" podStartSLOduration=5.547969865 podStartE2EDuration="7.944934448s" podCreationTimestamp="2025-10-03 08:52:48 +0000 UTC" firstStartedPulling="2025-10-03 08:52:50.505438685 +0000 UTC m=+814.806933015" lastFinishedPulling="2025-10-03 08:52:52.902403268 +0000 UTC m=+817.203897598" observedRunningTime="2025-10-03 08:52:53.54184151 +0000 UTC m=+817.843335840" watchObservedRunningTime="2025-10-03 08:52:55.944934448 +0000 UTC m=+820.246428778" Oct 03 08:52:55 crc kubenswrapper[4765]: I1003 08:52:55.947893 4765 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-hj8lb"] Oct 03 08:52:55 crc kubenswrapper[4765]: I1003 08:52:55.949325 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:52:55 crc kubenswrapper[4765]: I1003 08:52:55.958077 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hj8lb"] Oct 03 08:52:56 crc kubenswrapper[4765]: I1003 08:52:56.013633 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-utilities\") pod \"certified-operators-hj8lb\" (UID: \"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:52:56 crc kubenswrapper[4765]: I1003 08:52:56.013703 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6bbh\" (UniqueName: \"kubernetes.io/projected/7664de21-4628-49dc-9b0c-f42d8e0b54de-kube-api-access-l6bbh\") pod \"certified-operators-hj8lb\" (UID: \"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:52:56 crc kubenswrapper[4765]: I1003 08:52:56.013797 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-catalog-content\") pod \"certified-operators-hj8lb\" (UID: \"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:52:56 crc kubenswrapper[4765]: I1003 08:52:56.114932 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6bbh\" (UniqueName: \"kubernetes.io/projected/7664de21-4628-49dc-9b0c-f42d8e0b54de-kube-api-access-l6bbh\") pod \"certified-operators-hj8lb\" (UID: \"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:52:56 crc kubenswrapper[4765]: I1003 08:52:56.115026 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-catalog-content\") pod \"certified-operators-hj8lb\" (UID: \"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:52:56 crc kubenswrapper[4765]: I1003 08:52:56.115093 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-utilities\") pod \"certified-operators-hj8lb\" (UID: \"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:52:56 crc kubenswrapper[4765]: I1003 08:52:56.115816 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-utilities\") pod \"certified-operators-hj8lb\" (UID: \"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:52:56 crc kubenswrapper[4765]: I1003 08:52:56.115921 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-catalog-content\") pod \"certified-operators-hj8lb\" (UID: 
\"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:52:56 crc kubenswrapper[4765]: I1003 08:52:56.140160 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6bbh\" (UniqueName: \"kubernetes.io/projected/7664de21-4628-49dc-9b0c-f42d8e0b54de-kube-api-access-l6bbh\") pod \"certified-operators-hj8lb\" (UID: \"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:52:56 crc kubenswrapper[4765]: I1003 08:52:56.268562 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:52:56 crc kubenswrapper[4765]: I1003 08:52:56.731990 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hj8lb"] Oct 03 08:52:57 crc kubenswrapper[4765]: I1003 08:52:57.548083 4765 generic.go:334] "Generic (PLEG): container finished" podID="7664de21-4628-49dc-9b0c-f42d8e0b54de" containerID="3cc4d49cbd5b7b3d59f3c1fc23cb923d38d9358154c5f01220e561c7745c146b" exitCode=0 Oct 03 08:52:57 crc kubenswrapper[4765]: I1003 08:52:57.548144 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj8lb" event={"ID":"7664de21-4628-49dc-9b0c-f42d8e0b54de","Type":"ContainerDied","Data":"3cc4d49cbd5b7b3d59f3c1fc23cb923d38d9358154c5f01220e561c7745c146b"} Oct 03 08:52:57 crc kubenswrapper[4765]: I1003 08:52:57.548404 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj8lb" event={"ID":"7664de21-4628-49dc-9b0c-f42d8e0b54de","Type":"ContainerStarted","Data":"7a1dcb00c89ab42b79e11d366a5e9409dcbdac41b7c515a762265ac2d6ac4d6a"} Oct 03 08:52:59 crc kubenswrapper[4765]: I1003 08:52:59.071545 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:59 crc kubenswrapper[4765]: I1003 08:52:59.071600 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:59 crc kubenswrapper[4765]: I1003 08:52:59.111707 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2859z" Oct 03 08:52:59 crc kubenswrapper[4765]: I1003 08:52:59.559480 4765 generic.go:334] "Generic (PLEG): container finished" podID="7664de21-4628-49dc-9b0c-f42d8e0b54de" containerID="afa9d41705808c58139526902d85924559edd3fba4f6a6b10eefd26a69c8b4a0" exitCode=0 Oct 03 08:52:59 crc kubenswrapper[4765]: I1003 08:52:59.559538 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj8lb" event={"ID":"7664de21-4628-49dc-9b0c-f42d8e0b54de","Type":"ContainerDied","Data":"afa9d41705808c58139526902d85924559edd3fba4f6a6b10eefd26a69c8b4a0"} Oct 03 08:52:59 crc kubenswrapper[4765]: I1003 08:52:59.597700 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2859z" Oct 03 08:53:00 crc kubenswrapper[4765]: I1003 08:53:00.566541 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj8lb" event={"ID":"7664de21-4628-49dc-9b0c-f42d8e0b54de","Type":"ContainerStarted","Data":"e9d22f1d20aba0869f9066640535b698e5044599323ba684c7444971410ac553"} Oct 03 08:53:00 crc kubenswrapper[4765]: I1003 08:53:00.586124 4765 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-hj8lb" podStartSLOduration=3.173215806 podStartE2EDuration="5.586103644s" podCreationTimestamp="2025-10-03 08:52:55 +0000 UTC" firstStartedPulling="2025-10-03 08:52:57.551034874 +0000 UTC m=+821.852529204" lastFinishedPulling="2025-10-03 08:52:59.963922712 +0000 UTC m=+824.265417042" observedRunningTime="2025-10-03 08:53:00.580858941 +0000 UTC m=+824.882353281" watchObservedRunningTime="2025-10-03 08:53:00.586103644 +0000 UTC m=+824.887597974" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.067486 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-778d9f7978-ns677" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.793471 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-sx8td"] Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.796635 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.801944 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.802034 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.802267 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-jjzfg" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.803059 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25"] Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.803832 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.815870 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.822335 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25"] Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.898786 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-h9tnj"] Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.899924 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-h9tnj" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.902051 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.902306 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.902449 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-fxbw5" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.905454 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2pq5\" (UniqueName: \"kubernetes.io/projected/6a402791-ba68-4594-bb36-a6c491fdf723-kube-api-access-h2pq5\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.905512 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-reloader\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.905544 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69p9v\" (UniqueName: \"kubernetes.io/projected/a9886be2-cc56-4d96-b89e-55c1fc65774e-kube-api-access-69p9v\") pod \"frr-k8s-webhook-server-64bf5d555-8gx25\" (UID: \"a9886be2-cc56-4d96-b89e-55c1fc65774e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.905573 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9886be2-cc56-4d96-b89e-55c1fc65774e-cert\") pod \"frr-k8s-webhook-server-64bf5d555-8gx25\" (UID: \"a9886be2-cc56-4d96-b89e-55c1fc65774e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.905612 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-frr-conf\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.905660 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/6a402791-ba68-4594-bb36-a6c491fdf723-frr-startup\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.905748 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a402791-ba68-4594-bb36-a6c491fdf723-metrics-certs\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.905799 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-frr-sockets\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.905841 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-metrics\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.913666 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.927872 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-2cwp9"] Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.929026 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.935299 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 03 08:53:01 crc kubenswrapper[4765]: I1003 08:53:01.949001 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-2cwp9"] Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.007209 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-metrics-certs\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.007548 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2pq5\" (UniqueName: \"kubernetes.io/projected/6a402791-ba68-4594-bb36-a6c491fdf723-kube-api-access-h2pq5\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.007701 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbxm\" (UniqueName: \"kubernetes.io/projected/e1ff5baf-0400-42a9-8815-1a618208934e-kube-api-access-ffbxm\") pod \"controller-68d546b9d8-2cwp9\" (UID: \"e1ff5baf-0400-42a9-8815-1a618208934e\") " pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.007910 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-reloader\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.008069 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69p9v\" (UniqueName: \"kubernetes.io/projected/a9886be2-cc56-4d96-b89e-55c1fc65774e-kube-api-access-69p9v\") pod \"frr-k8s-webhook-server-64bf5d555-8gx25\" (UID: \"a9886be2-cc56-4d96-b89e-55c1fc65774e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.008167 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/a9886be2-cc56-4d96-b89e-55c1fc65774e-cert\") pod \"frr-k8s-webhook-server-64bf5d555-8gx25\" (UID: \"a9886be2-cc56-4d96-b89e-55c1fc65774e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.008325 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1ff5baf-0400-42a9-8815-1a618208934e-cert\") pod \"controller-68d546b9d8-2cwp9\" (UID: \"e1ff5baf-0400-42a9-8815-1a618208934e\") " pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.008526 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1ff5baf-0400-42a9-8815-1a618208934e-metrics-certs\") pod \"controller-68d546b9d8-2cwp9\" (UID: \"e1ff5baf-0400-42a9-8815-1a618208934e\") " pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.008738 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-frr-conf\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.008852 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/6a402791-ba68-4594-bb36-a6c491fdf723-frr-startup\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.008996 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a402791-ba68-4594-bb36-a6c491fdf723-metrics-certs\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.009110 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzmlv\" (UniqueName: \"kubernetes.io/projected/93299a5d-55e2-4554-9fe8-4432acc25332-kube-api-access-gzmlv\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.009298 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-frr-sockets\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.009488 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-memberlist\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.009586 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-metrics\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " 
pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.009688 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/93299a5d-55e2-4554-9fe8-4432acc25332-metallb-excludel2\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: E1003 08:53:02.009215 4765 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 03 08:53:02 crc kubenswrapper[4765]: E1003 08:53:02.010099 4765 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.009969 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/6a402791-ba68-4594-bb36-a6c491fdf723-frr-startup\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.010155 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-frr-sockets\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.009142 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-reloader\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: E1003 08:53:02.010120 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9886be2-cc56-4d96-b89e-55c1fc65774e-cert podName:a9886be2-cc56-4d96-b89e-55c1fc65774e nodeName:}" failed. No retries permitted until 2025-10-03 08:53:02.510095862 +0000 UTC m=+826.811590402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a9886be2-cc56-4d96-b89e-55c1fc65774e-cert") pod "frr-k8s-webhook-server-64bf5d555-8gx25" (UID: "a9886be2-cc56-4d96-b89e-55c1fc65774e") : secret "frr-k8s-webhook-server-cert" not found Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.010234 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-metrics\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: E1003 08:53:02.010247 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a402791-ba68-4594-bb36-a6c491fdf723-metrics-certs podName:6a402791-ba68-4594-bb36-a6c491fdf723 nodeName:}" failed. No retries permitted until 2025-10-03 08:53:02.510227476 +0000 UTC m=+826.811722006 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6a402791-ba68-4594-bb36-a6c491fdf723-metrics-certs") pod "frr-k8s-sx8td" (UID: "6a402791-ba68-4594-bb36-a6c491fdf723") : secret "frr-k8s-certs-secret" not found Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.010433 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/6a402791-ba68-4594-bb36-a6c491fdf723-frr-conf\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.030729 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69p9v\" (UniqueName: \"kubernetes.io/projected/a9886be2-cc56-4d96-b89e-55c1fc65774e-kube-api-access-69p9v\") pod \"frr-k8s-webhook-server-64bf5d555-8gx25\" (UID: \"a9886be2-cc56-4d96-b89e-55c1fc65774e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.036670 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2pq5\" (UniqueName: \"kubernetes.io/projected/6a402791-ba68-4594-bb36-a6c491fdf723-kube-api-access-h2pq5\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.110846 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzmlv\" (UniqueName: \"kubernetes.io/projected/93299a5d-55e2-4554-9fe8-4432acc25332-kube-api-access-gzmlv\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.110937 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-memberlist\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.110966 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/93299a5d-55e2-4554-9fe8-4432acc25332-metallb-excludel2\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.110993 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-metrics-certs\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.111031 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbxm\" (UniqueName: \"kubernetes.io/projected/e1ff5baf-0400-42a9-8815-1a618208934e-kube-api-access-ffbxm\") pod \"controller-68d546b9d8-2cwp9\" (UID: \"e1ff5baf-0400-42a9-8815-1a618208934e\") " pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.111089 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1ff5baf-0400-42a9-8815-1a618208934e-cert\") pod \"controller-68d546b9d8-2cwp9\" (UID: 
\"e1ff5baf-0400-42a9-8815-1a618208934e\") " pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.111113 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1ff5baf-0400-42a9-8815-1a618208934e-metrics-certs\") pod \"controller-68d546b9d8-2cwp9\" (UID: \"e1ff5baf-0400-42a9-8815-1a618208934e\") " pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:02 crc kubenswrapper[4765]: E1003 08:53:02.111165 4765 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 08:53:02 crc kubenswrapper[4765]: E1003 08:53:02.111225 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-memberlist podName:93299a5d-55e2-4554-9fe8-4432acc25332 nodeName:}" failed. No retries permitted until 2025-10-03 08:53:02.611205298 +0000 UTC m=+826.912699628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-memberlist") pod "speaker-h9tnj" (UID: "93299a5d-55e2-4554-9fe8-4432acc25332") : secret "metallb-memberlist" not found Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.111985 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/93299a5d-55e2-4554-9fe8-4432acc25332-metallb-excludel2\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.114524 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1ff5baf-0400-42a9-8815-1a618208934e-cert\") pod \"controller-68d546b9d8-2cwp9\" (UID: \"e1ff5baf-0400-42a9-8815-1a618208934e\") " pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.114542 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-metrics-certs\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.116410 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1ff5baf-0400-42a9-8815-1a618208934e-metrics-certs\") pod \"controller-68d546b9d8-2cwp9\" (UID: \"e1ff5baf-0400-42a9-8815-1a618208934e\") " pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.137018 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbxm\" (UniqueName: \"kubernetes.io/projected/e1ff5baf-0400-42a9-8815-1a618208934e-kube-api-access-ffbxm\") pod \"controller-68d546b9d8-2cwp9\" (UID: \"e1ff5baf-0400-42a9-8815-1a618208934e\") " pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.140343 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzmlv\" (UniqueName: \"kubernetes.io/projected/93299a5d-55e2-4554-9fe8-4432acc25332-kube-api-access-gzmlv\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 
08:53:02.247092 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.517486 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9886be2-cc56-4d96-b89e-55c1fc65774e-cert\") pod \"frr-k8s-webhook-server-64bf5d555-8gx25\" (UID: \"a9886be2-cc56-4d96-b89e-55c1fc65774e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.517861 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a402791-ba68-4594-bb36-a6c491fdf723-metrics-certs\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.523120 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6a402791-ba68-4594-bb36-a6c491fdf723-metrics-certs\") pod \"frr-k8s-sx8td\" (UID: \"6a402791-ba68-4594-bb36-a6c491fdf723\") " pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.529372 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a9886be2-cc56-4d96-b89e-55c1fc65774e-cert\") pod \"frr-k8s-webhook-server-64bf5d555-8gx25\" (UID: \"a9886be2-cc56-4d96-b89e-55c1fc65774e\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.540700 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2859z"] Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.541085 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2859z" podUID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" containerName="registry-server" containerID="cri-o://061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d" gracePeriod=2 Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.619409 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-memberlist\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:02 crc kubenswrapper[4765]: E1003 08:53:02.619660 4765 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 08:53:02 crc kubenswrapper[4765]: E1003 08:53:02.619725 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-memberlist podName:93299a5d-55e2-4554-9fe8-4432acc25332 nodeName:}" failed. No retries permitted until 2025-10-03 08:53:03.619708235 +0000 UTC m=+827.921202575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-memberlist") pod "speaker-h9tnj" (UID: "93299a5d-55e2-4554-9fe8-4432acc25332") : secret "metallb-memberlist" not found Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.654266 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-2cwp9"] Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.718179 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.729033 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" Oct 03 08:53:02 crc kubenswrapper[4765]: I1003 08:53:02.923796 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2859z" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.023447 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-utilities\") pod \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.023505 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-catalog-content\") pod \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.023552 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5tfm\" (UniqueName: \"kubernetes.io/projected/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-kube-api-access-m5tfm\") pod \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\" (UID: \"f8e23dd0-fc37-42ec-ad04-5fc279d1e020\") " Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.024560 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-utilities" (OuterVolumeSpecName: "utilities") pod "f8e23dd0-fc37-42ec-ad04-5fc279d1e020" (UID: "f8e23dd0-fc37-42ec-ad04-5fc279d1e020"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.028046 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-kube-api-access-m5tfm" (OuterVolumeSpecName: "kube-api-access-m5tfm") pod "f8e23dd0-fc37-42ec-ad04-5fc279d1e020" (UID: "f8e23dd0-fc37-42ec-ad04-5fc279d1e020"). InnerVolumeSpecName "kube-api-access-m5tfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.080924 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8e23dd0-fc37-42ec-ad04-5fc279d1e020" (UID: "f8e23dd0-fc37-42ec-ad04-5fc279d1e020"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.124930 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.124984 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.124998 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5tfm\" (UniqueName: \"kubernetes.io/projected/f8e23dd0-fc37-42ec-ad04-5fc279d1e020-kube-api-access-m5tfm\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.220009 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25"] Oct 03 08:53:03 crc kubenswrapper[4765]: W1003 08:53:03.223374 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9886be2_cc56_4d96_b89e_55c1fc65774e.slice/crio-cad07dfbb0c666089aa11885f22be33690fa263963cfc5ec9224b3c8e60df53a WatchSource:0}: Error finding container cad07dfbb0c666089aa11885f22be33690fa263963cfc5ec9224b3c8e60df53a: Status 404 returned error can't find the container with id cad07dfbb0c666089aa11885f22be33690fa263963cfc5ec9224b3c8e60df53a Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.587310 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" event={"ID":"a9886be2-cc56-4d96-b89e-55c1fc65774e","Type":"ContainerStarted","Data":"cad07dfbb0c666089aa11885f22be33690fa263963cfc5ec9224b3c8e60df53a"} Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.588419 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sx8td" event={"ID":"6a402791-ba68-4594-bb36-a6c491fdf723","Type":"ContainerStarted","Data":"7f71649724dd9062e51c6dd22daf58ea8cfbc0140338adb28db6cf232c978cd9"} Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.590479 4765 generic.go:334] "Generic (PLEG): container finished" podID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" containerID="061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d" exitCode=0 Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.590538 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2859z" event={"ID":"f8e23dd0-fc37-42ec-ad04-5fc279d1e020","Type":"ContainerDied","Data":"061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d"} Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.590563 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2859z" event={"ID":"f8e23dd0-fc37-42ec-ad04-5fc279d1e020","Type":"ContainerDied","Data":"6fc49fde1ab0f894fa50deff7d8a27a227a0ae5d13f7a14550c95178718437ff"} Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.590584 4765 scope.go:117] "RemoveContainer" containerID="061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.590750 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2859z" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.595893 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-2cwp9" event={"ID":"e1ff5baf-0400-42a9-8815-1a618208934e","Type":"ContainerStarted","Data":"5badbf1987321348530a7934e0b2e9d12b556f92ee19581c3043f98139246133"} Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.595937 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-2cwp9" event={"ID":"e1ff5baf-0400-42a9-8815-1a618208934e","Type":"ContainerStarted","Data":"da317a643942551aaf02da1de720052320f54c2b7b55e422f9f53399cd392c1f"} Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.595951 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-2cwp9" event={"ID":"e1ff5baf-0400-42a9-8815-1a618208934e","Type":"ContainerStarted","Data":"b64a015f1ee75580eb55e8ebff80d1d35dc8182d4b4d159ac7d443467ceba500"} Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.596225 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.606252 4765 scope.go:117] "RemoveContainer" containerID="19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.615797 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-2cwp9" podStartSLOduration=2.615780028 podStartE2EDuration="2.615780028s" podCreationTimestamp="2025-10-03 08:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:53:03.614166676 +0000 UTC m=+827.915661006" watchObservedRunningTime="2025-10-03 08:53:03.615780028 +0000 UTC m=+827.917274358" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.641863 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2859z"] Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.642066 4765 scope.go:117] "RemoveContainer" containerID="ee16253950b711a1d51ddff84de5237b1d016f1201475ab7d8c1a816f306a32c" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.642803 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-memberlist\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.646529 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2859z"] Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.648119 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/93299a5d-55e2-4554-9fe8-4432acc25332-memberlist\") pod \"speaker-h9tnj\" (UID: \"93299a5d-55e2-4554-9fe8-4432acc25332\") " pod="metallb-system/speaker-h9tnj" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.682562 4765 scope.go:117] "RemoveContainer" containerID="061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d" Oct 03 08:53:03 crc kubenswrapper[4765]: E1003 08:53:03.683097 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d\": container with ID starting with 061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d not found: ID does not exist" containerID="061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.683139 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d"} err="failed to get container status \"061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d\": rpc error: code = NotFound desc = could not find container \"061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d\": container with ID starting with 061e08a99b4fbde01bd280ec08d6a2cf5e79f6d2f3f6c1bc2db91dd483bd6b6d not found: ID does not exist" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.683166 4765 scope.go:117] "RemoveContainer" containerID="19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216" Oct 03 08:53:03 crc kubenswrapper[4765]: E1003 08:53:03.683478 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216\": container with ID starting with 19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216 not found: ID does not exist" containerID="19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.683521 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216"} err="failed to get container status \"19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216\": rpc error: code = NotFound desc = could not find container \"19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216\": container with ID starting with 19ab1170464530c732a911edc8564a53255dc76e646784b14c3fbd64aea59216 not found: ID does not exist" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.683550 4765 scope.go:117] "RemoveContainer" containerID="ee16253950b711a1d51ddff84de5237b1d016f1201475ab7d8c1a816f306a32c" Oct 03 08:53:03 crc kubenswrapper[4765]: E1003 08:53:03.683820 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee16253950b711a1d51ddff84de5237b1d016f1201475ab7d8c1a816f306a32c\": container with ID starting with ee16253950b711a1d51ddff84de5237b1d016f1201475ab7d8c1a816f306a32c not found: ID does not exist" containerID="ee16253950b711a1d51ddff84de5237b1d016f1201475ab7d8c1a816f306a32c" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.683847 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee16253950b711a1d51ddff84de5237b1d016f1201475ab7d8c1a816f306a32c"} err="failed to get container status \"ee16253950b711a1d51ddff84de5237b1d016f1201475ab7d8c1a816f306a32c\": rpc error: code = NotFound desc = could not find container \"ee16253950b711a1d51ddff84de5237b1d016f1201475ab7d8c1a816f306a32c\": container with ID starting with ee16253950b711a1d51ddff84de5237b1d016f1201475ab7d8c1a816f306a32c not found: ID does not exist" Oct 03 08:53:03 crc kubenswrapper[4765]: I1003 08:53:03.718616 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-h9tnj" Oct 03 08:53:04 crc kubenswrapper[4765]: I1003 08:53:04.322506 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" path="/var/lib/kubelet/pods/f8e23dd0-fc37-42ec-ad04-5fc279d1e020/volumes" Oct 03 08:53:04 crc kubenswrapper[4765]: I1003 08:53:04.610889 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-h9tnj" event={"ID":"93299a5d-55e2-4554-9fe8-4432acc25332","Type":"ContainerStarted","Data":"527caf4042bf494074b36a7df136c1e5d5d8157404acad9dbbbeb740e9420b6f"} Oct 03 08:53:04 crc kubenswrapper[4765]: I1003 08:53:04.610955 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-h9tnj" event={"ID":"93299a5d-55e2-4554-9fe8-4432acc25332","Type":"ContainerStarted","Data":"3605af0f85744a8473dc8268c14f6dedba40c8ca0e234303332ba8abe9eeedc1"} Oct 03 08:53:04 crc kubenswrapper[4765]: I1003 08:53:04.610969 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-h9tnj" event={"ID":"93299a5d-55e2-4554-9fe8-4432acc25332","Type":"ContainerStarted","Data":"a492110b792818fc4f934ef03463f940dfb3085cf7771efa685fe79f8891e9dd"} Oct 03 08:53:04 crc kubenswrapper[4765]: I1003 08:53:04.611374 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-h9tnj" Oct 03 08:53:04 crc kubenswrapper[4765]: I1003 08:53:04.638844 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-h9tnj" podStartSLOduration=3.638825307 podStartE2EDuration="3.638825307s" podCreationTimestamp="2025-10-03 08:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:53:04.634266381 +0000 UTC m=+828.935760731" watchObservedRunningTime="2025-10-03 08:53:04.638825307 +0000 UTC m=+828.940319637" Oct 03 08:53:06 crc kubenswrapper[4765]: I1003 08:53:06.269306 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:53:06 crc kubenswrapper[4765]: I1003 08:53:06.269665 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:53:06 crc kubenswrapper[4765]: I1003 08:53:06.318960 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:53:06 crc kubenswrapper[4765]: I1003 08:53:06.664749 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:53:09 crc kubenswrapper[4765]: I1003 08:53:09.141162 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hj8lb"] Oct 03 08:53:09 crc kubenswrapper[4765]: I1003 08:53:09.141699 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hj8lb" podUID="7664de21-4628-49dc-9b0c-f42d8e0b54de" containerName="registry-server" containerID="cri-o://e9d22f1d20aba0869f9066640535b698e5044599323ba684c7444971410ac553" gracePeriod=2 Oct 03 08:53:09 crc kubenswrapper[4765]: I1003 08:53:09.664566 4765 generic.go:334] "Generic (PLEG): container finished" podID="7664de21-4628-49dc-9b0c-f42d8e0b54de" containerID="e9d22f1d20aba0869f9066640535b698e5044599323ba684c7444971410ac553" exitCode=0 Oct 03 08:53:09 crc kubenswrapper[4765]: 
I1003 08:53:09.664730 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj8lb" event={"ID":"7664de21-4628-49dc-9b0c-f42d8e0b54de","Type":"ContainerDied","Data":"e9d22f1d20aba0869f9066640535b698e5044599323ba684c7444971410ac553"} Oct 03 08:53:09 crc kubenswrapper[4765]: I1003 08:53:09.776969 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:53:09 crc kubenswrapper[4765]: I1003 08:53:09.937935 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-utilities\") pod \"7664de21-4628-49dc-9b0c-f42d8e0b54de\" (UID: \"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " Oct 03 08:53:09 crc kubenswrapper[4765]: I1003 08:53:09.938052 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6bbh\" (UniqueName: \"kubernetes.io/projected/7664de21-4628-49dc-9b0c-f42d8e0b54de-kube-api-access-l6bbh\") pod \"7664de21-4628-49dc-9b0c-f42d8e0b54de\" (UID: \"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " Oct 03 08:53:09 crc kubenswrapper[4765]: I1003 08:53:09.938133 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-catalog-content\") pod \"7664de21-4628-49dc-9b0c-f42d8e0b54de\" (UID: \"7664de21-4628-49dc-9b0c-f42d8e0b54de\") " Oct 03 08:53:09 crc kubenswrapper[4765]: I1003 08:53:09.939040 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-utilities" (OuterVolumeSpecName: "utilities") pod "7664de21-4628-49dc-9b0c-f42d8e0b54de" (UID: "7664de21-4628-49dc-9b0c-f42d8e0b54de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:09 crc kubenswrapper[4765]: I1003 08:53:09.939807 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:09 crc kubenswrapper[4765]: I1003 08:53:09.943953 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7664de21-4628-49dc-9b0c-f42d8e0b54de-kube-api-access-l6bbh" (OuterVolumeSpecName: "kube-api-access-l6bbh") pod "7664de21-4628-49dc-9b0c-f42d8e0b54de" (UID: "7664de21-4628-49dc-9b0c-f42d8e0b54de"). InnerVolumeSpecName "kube-api-access-l6bbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:53:09 crc kubenswrapper[4765]: I1003 08:53:09.997242 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7664de21-4628-49dc-9b0c-f42d8e0b54de" (UID: "7664de21-4628-49dc-9b0c-f42d8e0b54de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.041478 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6bbh\" (UniqueName: \"kubernetes.io/projected/7664de21-4628-49dc-9b0c-f42d8e0b54de-kube-api-access-l6bbh\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.041540 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7664de21-4628-49dc-9b0c-f42d8e0b54de-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.672792 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hj8lb" event={"ID":"7664de21-4628-49dc-9b0c-f42d8e0b54de","Type":"ContainerDied","Data":"7a1dcb00c89ab42b79e11d366a5e9409dcbdac41b7c515a762265ac2d6ac4d6a"} Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.672817 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hj8lb" Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.672850 4765 scope.go:117] "RemoveContainer" containerID="e9d22f1d20aba0869f9066640535b698e5044599323ba684c7444971410ac553" Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.675220 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" event={"ID":"a9886be2-cc56-4d96-b89e-55c1fc65774e","Type":"ContainerStarted","Data":"8c70a81b9c4a3c34de7d334d185cb54c48c623928b3d956b20d3df6181bf3173"} Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.675332 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.676871 4765 generic.go:334] "Generic (PLEG): container finished" podID="6a402791-ba68-4594-bb36-a6c491fdf723" containerID="8e313e54fb73695636ede1808f32f48afd7b6bc67d05cac6e43dd632017733ee" exitCode=0 Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.676906 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sx8td" event={"ID":"6a402791-ba68-4594-bb36-a6c491fdf723","Type":"ContainerDied","Data":"8e313e54fb73695636ede1808f32f48afd7b6bc67d05cac6e43dd632017733ee"} Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.692500 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" podStartSLOduration=3.280923434 podStartE2EDuration="9.692481218s" podCreationTimestamp="2025-10-03 08:53:01 +0000 UTC" firstStartedPulling="2025-10-03 08:53:03.225318955 +0000 UTC m=+827.526813285" lastFinishedPulling="2025-10-03 08:53:09.636876739 +0000 UTC m=+833.938371069" observedRunningTime="2025-10-03 08:53:10.690514188 +0000 UTC m=+834.992008518" watchObservedRunningTime="2025-10-03 08:53:10.692481218 +0000 UTC m=+834.993975548" Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.702833 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hj8lb"] Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.710354 4765 scope.go:117] "RemoveContainer" containerID="afa9d41705808c58139526902d85924559edd3fba4f6a6b10eefd26a69c8b4a0" Oct 03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.710822 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hj8lb"] Oct 
03 08:53:10 crc kubenswrapper[4765]: I1003 08:53:10.741705 4765 scope.go:117] "RemoveContainer" containerID="3cc4d49cbd5b7b3d59f3c1fc23cb923d38d9358154c5f01220e561c7745c146b" Oct 03 08:53:11 crc kubenswrapper[4765]: I1003 08:53:11.685176 4765 generic.go:334] "Generic (PLEG): container finished" podID="6a402791-ba68-4594-bb36-a6c491fdf723" containerID="17d7a8c3e29d5e0e619d9e118140664ef0d43e381759016972dce97c777ed524" exitCode=0 Oct 03 08:53:11 crc kubenswrapper[4765]: I1003 08:53:11.685230 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sx8td" event={"ID":"6a402791-ba68-4594-bb36-a6c491fdf723","Type":"ContainerDied","Data":"17d7a8c3e29d5e0e619d9e118140664ef0d43e381759016972dce97c777ed524"} Oct 03 08:53:12 crc kubenswrapper[4765]: I1003 08:53:12.251403 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-2cwp9" Oct 03 08:53:12 crc kubenswrapper[4765]: I1003 08:53:12.314421 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7664de21-4628-49dc-9b0c-f42d8e0b54de" path="/var/lib/kubelet/pods/7664de21-4628-49dc-9b0c-f42d8e0b54de/volumes" Oct 03 08:53:12 crc kubenswrapper[4765]: I1003 08:53:12.694529 4765 generic.go:334] "Generic (PLEG): container finished" podID="6a402791-ba68-4594-bb36-a6c491fdf723" containerID="e8ca811111398823f70a99f0b063a34d114f4e3679f551f33f8a40bb02b96a3b" exitCode=0 Oct 03 08:53:12 crc kubenswrapper[4765]: I1003 08:53:12.694577 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sx8td" event={"ID":"6a402791-ba68-4594-bb36-a6c491fdf723","Type":"ContainerDied","Data":"e8ca811111398823f70a99f0b063a34d114f4e3679f551f33f8a40bb02b96a3b"} Oct 03 08:53:13 crc kubenswrapper[4765]: I1003 08:53:13.706125 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sx8td" event={"ID":"6a402791-ba68-4594-bb36-a6c491fdf723","Type":"ContainerStarted","Data":"036064b79d20edf6027751440e49def584290cc761a8afe80d14a80469b40547"} Oct 03 08:53:13 crc kubenswrapper[4765]: I1003 08:53:13.706187 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sx8td" event={"ID":"6a402791-ba68-4594-bb36-a6c491fdf723","Type":"ContainerStarted","Data":"b7c3ec10ec6e89d50682906761e6b57dff29337b9a4ae356c59c62da7c7d55c3"} Oct 03 08:53:13 crc kubenswrapper[4765]: I1003 08:53:13.706200 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sx8td" event={"ID":"6a402791-ba68-4594-bb36-a6c491fdf723","Type":"ContainerStarted","Data":"c2e2e3e3d0dff58f5b50d6bdec0c0a79cbc9016fd1b98b72021a6f44ee52b106"} Oct 03 08:53:14 crc kubenswrapper[4765]: I1003 08:53:14.716780 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sx8td" event={"ID":"6a402791-ba68-4594-bb36-a6c491fdf723","Type":"ContainerStarted","Data":"32ae271aa871b2f00bb1f07aa294ad9da5354701fa3dddd3eb4ef5dbc0f26a99"} Oct 03 08:53:14 crc kubenswrapper[4765]: I1003 08:53:14.717100 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sx8td" event={"ID":"6a402791-ba68-4594-bb36-a6c491fdf723","Type":"ContainerStarted","Data":"5e6210dc91fee3dee36972c77f330b95cc6e0e6417da3a1ac65dda45920b0e43"} Oct 03 08:53:14 crc kubenswrapper[4765]: I1003 08:53:14.717112 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-sx8td" 
event={"ID":"6a402791-ba68-4594-bb36-a6c491fdf723","Type":"ContainerStarted","Data":"50588a3d90be156678158254c6f2551d98b65d8f0ab3796bf4b7a906fd387615"} Oct 03 08:53:14 crc kubenswrapper[4765]: I1003 08:53:14.717890 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:14 crc kubenswrapper[4765]: I1003 08:53:14.737828 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-sx8td" podStartSLOduration=6.984506386 podStartE2EDuration="13.737812592s" podCreationTimestamp="2025-10-03 08:53:01 +0000 UTC" firstStartedPulling="2025-10-03 08:53:02.843015991 +0000 UTC m=+827.144510321" lastFinishedPulling="2025-10-03 08:53:09.596322197 +0000 UTC m=+833.897816527" observedRunningTime="2025-10-03 08:53:14.736267353 +0000 UTC m=+839.037761693" watchObservedRunningTime="2025-10-03 08:53:14.737812592 +0000 UTC m=+839.039306922" Oct 03 08:53:17 crc kubenswrapper[4765]: I1003 08:53:17.719618 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:17 crc kubenswrapper[4765]: I1003 08:53:17.766464 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:22 crc kubenswrapper[4765]: I1003 08:53:22.720806 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-sx8td" Oct 03 08:53:22 crc kubenswrapper[4765]: I1003 08:53:22.734849 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-8gx25" Oct 03 08:53:23 crc kubenswrapper[4765]: I1003 08:53:23.722540 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-h9tnj" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.798721 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6"] Oct 03 08:53:25 crc kubenswrapper[4765]: E1003 08:53:25.799316 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" containerName="extract-utilities" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.799337 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" containerName="extract-utilities" Oct 03 08:53:25 crc kubenswrapper[4765]: E1003 08:53:25.799353 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7664de21-4628-49dc-9b0c-f42d8e0b54de" containerName="extract-content" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.799361 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7664de21-4628-49dc-9b0c-f42d8e0b54de" containerName="extract-content" Oct 03 08:53:25 crc kubenswrapper[4765]: E1003 08:53:25.799378 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7664de21-4628-49dc-9b0c-f42d8e0b54de" containerName="registry-server" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.799387 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7664de21-4628-49dc-9b0c-f42d8e0b54de" containerName="registry-server" Oct 03 08:53:25 crc kubenswrapper[4765]: E1003 08:53:25.799396 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7664de21-4628-49dc-9b0c-f42d8e0b54de" containerName="extract-utilities" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.799405 4765 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7664de21-4628-49dc-9b0c-f42d8e0b54de" containerName="extract-utilities" Oct 03 08:53:25 crc kubenswrapper[4765]: E1003 08:53:25.799420 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" containerName="extract-content" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.799427 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" containerName="extract-content" Oct 03 08:53:25 crc kubenswrapper[4765]: E1003 08:53:25.799441 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" containerName="registry-server" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.799447 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" containerName="registry-server" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.799545 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7664de21-4628-49dc-9b0c-f42d8e0b54de" containerName="registry-server" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.799560 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e23dd0-fc37-42ec-ad04-5fc279d1e020" containerName="registry-server" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.800460 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.804315 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.848348 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6"] Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.945923 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.945990 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:25 crc kubenswrapper[4765]: I1003 08:53:25.946075 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zfbq\" (UniqueName: \"kubernetes.io/projected/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-kube-api-access-9zfbq\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:26 crc kubenswrapper[4765]: I1003 08:53:26.046737 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zfbq\" (UniqueName: 
\"kubernetes.io/projected/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-kube-api-access-9zfbq\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:26 crc kubenswrapper[4765]: I1003 08:53:26.047099 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:26 crc kubenswrapper[4765]: I1003 08:53:26.047126 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:26 crc kubenswrapper[4765]: I1003 08:53:26.047788 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:26 crc kubenswrapper[4765]: I1003 08:53:26.047805 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:26 crc kubenswrapper[4765]: I1003 08:53:26.067519 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zfbq\" (UniqueName: \"kubernetes.io/projected/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-kube-api-access-9zfbq\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:26 crc kubenswrapper[4765]: I1003 08:53:26.172706 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:26 crc kubenswrapper[4765]: I1003 08:53:26.603835 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6"] Oct 03 08:53:26 crc kubenswrapper[4765]: W1003 08:53:26.607016 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c6b0c1_ad97_4661_81c4_0ae36496ca1e.slice/crio-39797758c2ffeeb52ff9e7783e0f57ff881a7a7ae74f383f1a0a1a92c600af56 WatchSource:0}: Error finding container 39797758c2ffeeb52ff9e7783e0f57ff881a7a7ae74f383f1a0a1a92c600af56: Status 404 returned error can't find the container with id 39797758c2ffeeb52ff9e7783e0f57ff881a7a7ae74f383f1a0a1a92c600af56 Oct 03 08:53:26 crc kubenswrapper[4765]: I1003 08:53:26.802488 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" event={"ID":"16c6b0c1-ad97-4661-81c4-0ae36496ca1e","Type":"ContainerStarted","Data":"dc84a08785fbf4f9582d3a41645bfff4c7a61f614db47b2be809807d8c892e3f"} Oct 03 08:53:26 crc kubenswrapper[4765]: I1003 08:53:26.802868 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" event={"ID":"16c6b0c1-ad97-4661-81c4-0ae36496ca1e","Type":"ContainerStarted","Data":"39797758c2ffeeb52ff9e7783e0f57ff881a7a7ae74f383f1a0a1a92c600af56"} Oct 03 08:53:27 crc kubenswrapper[4765]: I1003 08:53:27.808631 4765 generic.go:334] "Generic (PLEG): container finished" podID="16c6b0c1-ad97-4661-81c4-0ae36496ca1e" containerID="dc84a08785fbf4f9582d3a41645bfff4c7a61f614db47b2be809807d8c892e3f" exitCode=0 Oct 03 08:53:27 crc kubenswrapper[4765]: I1003 08:53:27.808689 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" event={"ID":"16c6b0c1-ad97-4661-81c4-0ae36496ca1e","Type":"ContainerDied","Data":"dc84a08785fbf4f9582d3a41645bfff4c7a61f614db47b2be809807d8c892e3f"} Oct 03 08:53:29 crc kubenswrapper[4765]: I1003 08:53:29.821870 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" event={"ID":"16c6b0c1-ad97-4661-81c4-0ae36496ca1e","Type":"ContainerStarted","Data":"42935077ed81d0bf0a3ca838f854057a05f8487c380c2fbdcdff5dd28012c96c"} Oct 03 08:53:30 crc kubenswrapper[4765]: I1003 08:53:30.829467 4765 generic.go:334] "Generic (PLEG): container finished" podID="16c6b0c1-ad97-4661-81c4-0ae36496ca1e" containerID="42935077ed81d0bf0a3ca838f854057a05f8487c380c2fbdcdff5dd28012c96c" exitCode=0 Oct 03 08:53:30 crc kubenswrapper[4765]: I1003 08:53:30.829514 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" event={"ID":"16c6b0c1-ad97-4661-81c4-0ae36496ca1e","Type":"ContainerDied","Data":"42935077ed81d0bf0a3ca838f854057a05f8487c380c2fbdcdff5dd28012c96c"} Oct 03 08:53:31 crc kubenswrapper[4765]: I1003 08:53:31.836788 4765 generic.go:334] "Generic (PLEG): container finished" podID="16c6b0c1-ad97-4661-81c4-0ae36496ca1e" containerID="1ded45c925dca0223a5b56628ad3981d8e6733e0ffa8afd0b0f0540d5c04fab0" exitCode=0 Oct 03 08:53:31 crc kubenswrapper[4765]: I1003 08:53:31.836835 4765 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" event={"ID":"16c6b0c1-ad97-4661-81c4-0ae36496ca1e","Type":"ContainerDied","Data":"1ded45c925dca0223a5b56628ad3981d8e6733e0ffa8afd0b0f0540d5c04fab0"} Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.128632 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.161464 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tfqnw"] Oct 03 08:53:33 crc kubenswrapper[4765]: E1003 08:53:33.161809 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c6b0c1-ad97-4661-81c4-0ae36496ca1e" containerName="extract" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.161831 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c6b0c1-ad97-4661-81c4-0ae36496ca1e" containerName="extract" Oct 03 08:53:33 crc kubenswrapper[4765]: E1003 08:53:33.161844 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c6b0c1-ad97-4661-81c4-0ae36496ca1e" containerName="util" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.161853 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c6b0c1-ad97-4661-81c4-0ae36496ca1e" containerName="util" Oct 03 08:53:33 crc kubenswrapper[4765]: E1003 08:53:33.161876 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c6b0c1-ad97-4661-81c4-0ae36496ca1e" containerName="pull" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.161884 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c6b0c1-ad97-4661-81c4-0ae36496ca1e" containerName="pull" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.162025 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c6b0c1-ad97-4661-81c4-0ae36496ca1e" containerName="extract" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.163052 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.175436 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfqnw"] Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.256788 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-util\") pod \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.257339 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zfbq\" (UniqueName: \"kubernetes.io/projected/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-kube-api-access-9zfbq\") pod \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.257414 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-bundle\") pod \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\" (UID: \"16c6b0c1-ad97-4661-81c4-0ae36496ca1e\") " Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.257656 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-catalog-content\") pod \"redhat-marketplace-tfqnw\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") " pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.257702 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-utilities\") pod \"redhat-marketplace-tfqnw\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") " pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.257748 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrvcd\" (UniqueName: \"kubernetes.io/projected/26d9e9b3-34a3-4c3f-af50-7ebf64127770-kube-api-access-vrvcd\") pod \"redhat-marketplace-tfqnw\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") " pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.261799 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-bundle" (OuterVolumeSpecName: "bundle") pod "16c6b0c1-ad97-4661-81c4-0ae36496ca1e" (UID: "16c6b0c1-ad97-4661-81c4-0ae36496ca1e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.266520 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-kube-api-access-9zfbq" (OuterVolumeSpecName: "kube-api-access-9zfbq") pod "16c6b0c1-ad97-4661-81c4-0ae36496ca1e" (UID: "16c6b0c1-ad97-4661-81c4-0ae36496ca1e"). InnerVolumeSpecName "kube-api-access-9zfbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.269165 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-util" (OuterVolumeSpecName: "util") pod "16c6b0c1-ad97-4661-81c4-0ae36496ca1e" (UID: "16c6b0c1-ad97-4661-81c4-0ae36496ca1e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.359122 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-catalog-content\") pod \"redhat-marketplace-tfqnw\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") " pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.359179 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-utilities\") pod \"redhat-marketplace-tfqnw\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") " pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.359222 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrvcd\" (UniqueName: \"kubernetes.io/projected/26d9e9b3-34a3-4c3f-af50-7ebf64127770-kube-api-access-vrvcd\") pod \"redhat-marketplace-tfqnw\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") " pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.359319 4765 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.359331 4765 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.359343 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zfbq\" (UniqueName: \"kubernetes.io/projected/16c6b0c1-ad97-4661-81c4-0ae36496ca1e-kube-api-access-9zfbq\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.360161 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-catalog-content\") pod \"redhat-marketplace-tfqnw\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") " pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.360216 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-utilities\") pod \"redhat-marketplace-tfqnw\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") " pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.378109 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrvcd\" (UniqueName: \"kubernetes.io/projected/26d9e9b3-34a3-4c3f-af50-7ebf64127770-kube-api-access-vrvcd\") pod \"redhat-marketplace-tfqnw\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") " 
pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.486601 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.850085 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" event={"ID":"16c6b0c1-ad97-4661-81c4-0ae36496ca1e","Type":"ContainerDied","Data":"39797758c2ffeeb52ff9e7783e0f57ff881a7a7ae74f383f1a0a1a92c600af56"} Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.850451 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39797758c2ffeeb52ff9e7783e0f57ff881a7a7ae74f383f1a0a1a92c600af56" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.850115 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6" Oct 03 08:53:33 crc kubenswrapper[4765]: I1003 08:53:33.880619 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfqnw"] Oct 03 08:53:33 crc kubenswrapper[4765]: W1003 08:53:33.885046 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26d9e9b3_34a3_4c3f_af50_7ebf64127770.slice/crio-592f5c5866a31a721939cd9460ed9a9ce63bcd02da6bab855cb61b3fb41daa72 WatchSource:0}: Error finding container 592f5c5866a31a721939cd9460ed9a9ce63bcd02da6bab855cb61b3fb41daa72: Status 404 returned error can't find the container with id 592f5c5866a31a721939cd9460ed9a9ce63bcd02da6bab855cb61b3fb41daa72 Oct 03 08:53:34 crc kubenswrapper[4765]: I1003 08:53:34.858558 4765 generic.go:334] "Generic (PLEG): container finished" podID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" containerID="e03814312320c243725b66ed6d7eb0f650f10890416ad14ee0eac49fbcf0ac10" exitCode=0 Oct 03 08:53:34 crc kubenswrapper[4765]: I1003 08:53:34.858926 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfqnw" event={"ID":"26d9e9b3-34a3-4c3f-af50-7ebf64127770","Type":"ContainerDied","Data":"e03814312320c243725b66ed6d7eb0f650f10890416ad14ee0eac49fbcf0ac10"} Oct 03 08:53:34 crc kubenswrapper[4765]: I1003 08:53:34.859006 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfqnw" event={"ID":"26d9e9b3-34a3-4c3f-af50-7ebf64127770","Type":"ContainerStarted","Data":"592f5c5866a31a721939cd9460ed9a9ce63bcd02da6bab855cb61b3fb41daa72"} Oct 03 08:53:35 crc kubenswrapper[4765]: I1003 08:53:35.865793 4765 generic.go:334] "Generic (PLEG): container finished" podID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" containerID="c926fd737e257cbe85894f1f288b537287d216d292010a256d3f304ec6512c13" exitCode=0 Oct 03 08:53:35 crc kubenswrapper[4765]: I1003 08:53:35.866421 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfqnw" event={"ID":"26d9e9b3-34a3-4c3f-af50-7ebf64127770","Type":"ContainerDied","Data":"c926fd737e257cbe85894f1f288b537287d216d292010a256d3f304ec6512c13"} Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.426045 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nv5bg"] Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.427658 4765 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nv5bg" Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.434687 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.434986 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.435239 4765 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-t4z8v" Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.482232 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nv5bg"] Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.601823 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm4nr\" (UniqueName: \"kubernetes.io/projected/4dc5b97e-ce00-40ea-915b-c4dfa8a31162-kube-api-access-gm4nr\") pod \"cert-manager-operator-controller-manager-57cd46d6d-nv5bg\" (UID: \"4dc5b97e-ce00-40ea-915b-c4dfa8a31162\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nv5bg" Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.703039 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm4nr\" (UniqueName: \"kubernetes.io/projected/4dc5b97e-ce00-40ea-915b-c4dfa8a31162-kube-api-access-gm4nr\") pod \"cert-manager-operator-controller-manager-57cd46d6d-nv5bg\" (UID: \"4dc5b97e-ce00-40ea-915b-c4dfa8a31162\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nv5bg" Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.726718 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm4nr\" (UniqueName: \"kubernetes.io/projected/4dc5b97e-ce00-40ea-915b-c4dfa8a31162-kube-api-access-gm4nr\") pod \"cert-manager-operator-controller-manager-57cd46d6d-nv5bg\" (UID: \"4dc5b97e-ce00-40ea-915b-c4dfa8a31162\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nv5bg" Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.747129 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nv5bg" Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.897485 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfqnw" event={"ID":"26d9e9b3-34a3-4c3f-af50-7ebf64127770","Type":"ContainerStarted","Data":"e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea"} Oct 03 08:53:36 crc kubenswrapper[4765]: I1003 08:53:36.930710 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tfqnw" podStartSLOduration=2.558888209 podStartE2EDuration="3.930694589s" podCreationTimestamp="2025-10-03 08:53:33 +0000 UTC" firstStartedPulling="2025-10-03 08:53:34.860585969 +0000 UTC m=+859.162080299" lastFinishedPulling="2025-10-03 08:53:36.232392349 +0000 UTC m=+860.533886679" observedRunningTime="2025-10-03 08:53:36.926482222 +0000 UTC m=+861.227976572" watchObservedRunningTime="2025-10-03 08:53:36.930694589 +0000 UTC m=+861.232188919" Oct 03 08:53:37 crc kubenswrapper[4765]: I1003 08:53:37.415601 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nv5bg"] Oct 03 08:53:37 crc kubenswrapper[4765]: W1003 08:53:37.430072 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dc5b97e_ce00_40ea_915b_c4dfa8a31162.slice/crio-a616f2c00c4c85298d23b22b19ec9953a7ed33d1da31e6d98a994c0971ce62c0 WatchSource:0}: Error finding container a616f2c00c4c85298d23b22b19ec9953a7ed33d1da31e6d98a994c0971ce62c0: Status 404 returned error can't find the container with id a616f2c00c4c85298d23b22b19ec9953a7ed33d1da31e6d98a994c0971ce62c0 Oct 03 08:53:37 crc kubenswrapper[4765]: I1003 08:53:37.905577 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nv5bg" event={"ID":"4dc5b97e-ce00-40ea-915b-c4dfa8a31162","Type":"ContainerStarted","Data":"a616f2c00c4c85298d23b22b19ec9953a7ed33d1da31e6d98a994c0971ce62c0"} Oct 03 08:53:41 crc kubenswrapper[4765]: I1003 08:53:41.929523 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nv5bg" event={"ID":"4dc5b97e-ce00-40ea-915b-c4dfa8a31162","Type":"ContainerStarted","Data":"bd54c3e0ca18d34ebb00a88a9031487aac87dc78e4e6d983d6d689628f16d0ec"} Oct 03 08:53:41 crc kubenswrapper[4765]: I1003 08:53:41.954809 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-nv5bg" podStartSLOduration=2.493290075 podStartE2EDuration="5.954791224s" podCreationTimestamp="2025-10-03 08:53:36 +0000 UTC" firstStartedPulling="2025-10-03 08:53:37.432203518 +0000 UTC m=+861.733697848" lastFinishedPulling="2025-10-03 08:53:40.893704667 +0000 UTC m=+865.195198997" observedRunningTime="2025-10-03 08:53:41.952723442 +0000 UTC m=+866.254217772" watchObservedRunningTime="2025-10-03 08:53:41.954791224 +0000 UTC m=+866.256285544" Oct 03 08:53:43 crc kubenswrapper[4765]: I1003 08:53:43.487085 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:43 crc kubenswrapper[4765]: I1003 08:53:43.487438 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tfqnw" 
Oct 03 08:53:43 crc kubenswrapper[4765]: I1003 08:53:43.538756 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:43 crc kubenswrapper[4765]: I1003 08:53:43.930094 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-tpwbb"] Oct 03 08:53:43 crc kubenswrapper[4765]: I1003 08:53:43.930911 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-tpwbb" Oct 03 08:53:43 crc kubenswrapper[4765]: I1003 08:53:43.933387 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 03 08:53:43 crc kubenswrapper[4765]: I1003 08:53:43.933702 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 03 08:53:43 crc kubenswrapper[4765]: I1003 08:53:43.935283 4765 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-26pmq" Oct 03 08:53:43 crc kubenswrapper[4765]: I1003 08:53:43.945918 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-tpwbb"] Oct 03 08:53:44 crc kubenswrapper[4765]: I1003 08:53:44.001806 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tfqnw" Oct 03 08:53:44 crc kubenswrapper[4765]: I1003 08:53:44.109187 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd-bound-sa-token\") pod \"cert-manager-webhook-d969966f-tpwbb\" (UID: \"6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd\") " pod="cert-manager/cert-manager-webhook-d969966f-tpwbb" Oct 03 08:53:44 crc kubenswrapper[4765]: I1003 08:53:44.109301 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5khnf\" (UniqueName: \"kubernetes.io/projected/6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd-kube-api-access-5khnf\") pod \"cert-manager-webhook-d969966f-tpwbb\" (UID: \"6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd\") " pod="cert-manager/cert-manager-webhook-d969966f-tpwbb" Oct 03 08:53:44 crc kubenswrapper[4765]: I1003 08:53:44.210626 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd-bound-sa-token\") pod \"cert-manager-webhook-d969966f-tpwbb\" (UID: \"6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd\") " pod="cert-manager/cert-manager-webhook-d969966f-tpwbb" Oct 03 08:53:44 crc kubenswrapper[4765]: I1003 08:53:44.210739 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5khnf\" (UniqueName: \"kubernetes.io/projected/6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd-kube-api-access-5khnf\") pod \"cert-manager-webhook-d969966f-tpwbb\" (UID: \"6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd\") " pod="cert-manager/cert-manager-webhook-d969966f-tpwbb" Oct 03 08:53:44 crc kubenswrapper[4765]: I1003 08:53:44.251849 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd-bound-sa-token\") pod \"cert-manager-webhook-d969966f-tpwbb\" (UID: \"6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd\") " pod="cert-manager/cert-manager-webhook-d969966f-tpwbb" Oct 03 08:53:44 crc 
kubenswrapper[4765]: I1003 08:53:44.252045 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5khnf\" (UniqueName: \"kubernetes.io/projected/6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd-kube-api-access-5khnf\") pod \"cert-manager-webhook-d969966f-tpwbb\" (UID: \"6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd\") " pod="cert-manager/cert-manager-webhook-d969966f-tpwbb"
Oct 03 08:53:44 crc kubenswrapper[4765]: I1003 08:53:44.259990 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-tpwbb"
Oct 03 08:53:44 crc kubenswrapper[4765]: I1003 08:53:44.862504 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-tpwbb"]
Oct 03 08:53:44 crc kubenswrapper[4765]: I1003 08:53:44.949250 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-tpwbb" event={"ID":"6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd","Type":"ContainerStarted","Data":"e18a62967c483a0dcd708f877e5edcd995a75ba8f1ec27a73c70165788ff61a4"}
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.146725 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfqnw"]
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.149209 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tfqnw" podUID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" containerName="registry-server" containerID="cri-o://e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea" gracePeriod=2
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.640488 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfqnw"
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.685146 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-utilities\") pod \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") "
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.685283 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrvcd\" (UniqueName: \"kubernetes.io/projected/26d9e9b3-34a3-4c3f-af50-7ebf64127770-kube-api-access-vrvcd\") pod \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") "
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.685344 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-catalog-content\") pod \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\" (UID: \"26d9e9b3-34a3-4c3f-af50-7ebf64127770\") "
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.686590 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-utilities" (OuterVolumeSpecName: "utilities") pod "26d9e9b3-34a3-4c3f-af50-7ebf64127770" (UID: "26d9e9b3-34a3-4c3f-af50-7ebf64127770"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.705870 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d9e9b3-34a3-4c3f-af50-7ebf64127770-kube-api-access-vrvcd" (OuterVolumeSpecName: "kube-api-access-vrvcd") pod "26d9e9b3-34a3-4c3f-af50-7ebf64127770" (UID: "26d9e9b3-34a3-4c3f-af50-7ebf64127770"). InnerVolumeSpecName "kube-api-access-vrvcd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.710386 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26d9e9b3-34a3-4c3f-af50-7ebf64127770" (UID: "26d9e9b3-34a3-4c3f-af50-7ebf64127770"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.787550 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrvcd\" (UniqueName: \"kubernetes.io/projected/26d9e9b3-34a3-4c3f-af50-7ebf64127770-kube-api-access-vrvcd\") on node \"crc\" DevicePath \"\""
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.787596 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.787611 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d9e9b3-34a3-4c3f-af50-7ebf64127770-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.900717 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k"]
Oct 03 08:53:47 crc kubenswrapper[4765]: E1003 08:53:47.901165 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" containerName="extract-content"
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.901188 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" containerName="extract-content"
Oct 03 08:53:47 crc kubenswrapper[4765]: E1003 08:53:47.901204 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" containerName="extract-utilities"
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.901213 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" containerName="extract-utilities"
Oct 03 08:53:47 crc kubenswrapper[4765]: E1003 08:53:47.901222 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" containerName="registry-server"
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.901230 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" containerName="registry-server"
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.901405 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" containerName="registry-server"
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.901957 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k"]
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.902079 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k"
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.906890 4765 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-dv9kr"
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.988395 4765 generic.go:334] "Generic (PLEG): container finished" podID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" containerID="e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea" exitCode=0
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.988439 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfqnw" event={"ID":"26d9e9b3-34a3-4c3f-af50-7ebf64127770","Type":"ContainerDied","Data":"e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea"}
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.988467 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfqnw" event={"ID":"26d9e9b3-34a3-4c3f-af50-7ebf64127770","Type":"ContainerDied","Data":"592f5c5866a31a721939cd9460ed9a9ce63bcd02da6bab855cb61b3fb41daa72"}
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.988484 4765 scope.go:117] "RemoveContainer" containerID="e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea"
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.988591 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfqnw"
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.989827 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jqd4\" (UniqueName: \"kubernetes.io/projected/fd40bba3-15c9-42d9-83f3-aeb014fd89eb-kube-api-access-9jqd4\") pod \"cert-manager-cainjector-7d9f95dbf-9nl8k\" (UID: \"fd40bba3-15c9-42d9-83f3-aeb014fd89eb\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k"
Oct 03 08:53:47 crc kubenswrapper[4765]: I1003 08:53:47.989876 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd40bba3-15c9-42d9-83f3-aeb014fd89eb-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-9nl8k\" (UID: \"fd40bba3-15c9-42d9-83f3-aeb014fd89eb\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k"
Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.008315 4765 scope.go:117] "RemoveContainer" containerID="c926fd737e257cbe85894f1f288b537287d216d292010a256d3f304ec6512c13"
Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.026981 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfqnw"]
Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.036614 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfqnw"]
Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.050303 4765 scope.go:117] "RemoveContainer" containerID="e03814312320c243725b66ed6d7eb0f650f10890416ad14ee0eac49fbcf0ac10"
Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.070875 4765 scope.go:117] "RemoveContainer" containerID="e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea"
Oct 03 08:53:48 crc kubenswrapper[4765]: E1003 08:53:48.071552 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea\": container with ID starting with e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea not found: ID does not exist" containerID="e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea"
Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.071605 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea"} err="failed to get container status \"e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea\": rpc error: code = NotFound desc = could not find container \"e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea\": container with ID starting with e5190046985b19216f2e0428a4da34a58d77c4333cbec208b5a42cda751cdeea not found: ID does not exist"
Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.071635 4765 scope.go:117] "RemoveContainer" containerID="c926fd737e257cbe85894f1f288b537287d216d292010a256d3f304ec6512c13"
Oct 03 08:53:48 crc kubenswrapper[4765]: E1003 08:53:48.072086 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c926fd737e257cbe85894f1f288b537287d216d292010a256d3f304ec6512c13\": container with ID starting with c926fd737e257cbe85894f1f288b537287d216d292010a256d3f304ec6512c13 not found: ID does not exist" containerID="c926fd737e257cbe85894f1f288b537287d216d292010a256d3f304ec6512c13"
Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.072115 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c926fd737e257cbe85894f1f288b537287d216d292010a256d3f304ec6512c13"} err="failed to get container status \"c926fd737e257cbe85894f1f288b537287d216d292010a256d3f304ec6512c13\": rpc error: code = NotFound desc = could not find container \"c926fd737e257cbe85894f1f288b537287d216d292010a256d3f304ec6512c13\": container with ID starting with c926fd737e257cbe85894f1f288b537287d216d292010a256d3f304ec6512c13 not found: ID does not exist"
Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.072130 4765 scope.go:117] "RemoveContainer" containerID="e03814312320c243725b66ed6d7eb0f650f10890416ad14ee0eac49fbcf0ac10"
Oct 03 08:53:48 crc kubenswrapper[4765]: E1003 08:53:48.072349 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e03814312320c243725b66ed6d7eb0f650f10890416ad14ee0eac49fbcf0ac10\": container with ID starting with e03814312320c243725b66ed6d7eb0f650f10890416ad14ee0eac49fbcf0ac10 not found: ID does not exist" containerID="e03814312320c243725b66ed6d7eb0f650f10890416ad14ee0eac49fbcf0ac10"
Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.072369 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03814312320c243725b66ed6d7eb0f650f10890416ad14ee0eac49fbcf0ac10"} err="failed to get container status \"e03814312320c243725b66ed6d7eb0f650f10890416ad14ee0eac49fbcf0ac10\": rpc error: code = NotFound desc = could not find container \"e03814312320c243725b66ed6d7eb0f650f10890416ad14ee0eac49fbcf0ac10\": container with ID starting with e03814312320c243725b66ed6d7eb0f650f10890416ad14ee0eac49fbcf0ac10 not found: ID does not exist"
Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.091589 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jqd4\"
(UniqueName: \"kubernetes.io/projected/fd40bba3-15c9-42d9-83f3-aeb014fd89eb-kube-api-access-9jqd4\") pod \"cert-manager-cainjector-7d9f95dbf-9nl8k\" (UID: \"fd40bba3-15c9-42d9-83f3-aeb014fd89eb\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k" Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.091672 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd40bba3-15c9-42d9-83f3-aeb014fd89eb-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-9nl8k\" (UID: \"fd40bba3-15c9-42d9-83f3-aeb014fd89eb\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k" Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.110324 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd40bba3-15c9-42d9-83f3-aeb014fd89eb-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-9nl8k\" (UID: \"fd40bba3-15c9-42d9-83f3-aeb014fd89eb\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k" Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.110825 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jqd4\" (UniqueName: \"kubernetes.io/projected/fd40bba3-15c9-42d9-83f3-aeb014fd89eb-kube-api-access-9jqd4\") pod \"cert-manager-cainjector-7d9f95dbf-9nl8k\" (UID: \"fd40bba3-15c9-42d9-83f3-aeb014fd89eb\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k" Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.223296 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k" Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.316609 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d9e9b3-34a3-4c3f-af50-7ebf64127770" path="/var/lib/kubelet/pods/26d9e9b3-34a3-4c3f-af50-7ebf64127770/volumes" Oct 03 08:53:48 crc kubenswrapper[4765]: I1003 08:53:48.693407 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k"] Oct 03 08:53:51 crc kubenswrapper[4765]: I1003 08:53:51.014954 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k" event={"ID":"fd40bba3-15c9-42d9-83f3-aeb014fd89eb","Type":"ContainerStarted","Data":"342e4f0a96d5e45f395332bc62c794d2a05597d212199ae3b2b983432465d7ac"} Oct 03 08:53:51 crc kubenswrapper[4765]: I1003 08:53:51.016349 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-tpwbb" event={"ID":"6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd","Type":"ContainerStarted","Data":"3f59e0dc29d6b709eeb265622875b75d7acc4df1a5276e75f24abcd9cc706c8e"} Oct 03 08:53:51 crc kubenswrapper[4765]: I1003 08:53:51.016489 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-tpwbb" Oct 03 08:53:51 crc kubenswrapper[4765]: I1003 08:53:51.036438 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-tpwbb" podStartSLOduration=2.540142317 podStartE2EDuration="8.036416987s" podCreationTimestamp="2025-10-03 08:53:43 +0000 UTC" firstStartedPulling="2025-10-03 08:53:44.850786234 +0000 UTC m=+869.152280564" lastFinishedPulling="2025-10-03 08:53:50.347060904 +0000 UTC m=+874.648555234" observedRunningTime="2025-10-03 08:53:51.032456616 +0000 UTC m=+875.333950956" watchObservedRunningTime="2025-10-03 
08:53:51.036416987 +0000 UTC m=+875.337911317" Oct 03 08:53:52 crc kubenswrapper[4765]: I1003 08:53:52.023539 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k" event={"ID":"fd40bba3-15c9-42d9-83f3-aeb014fd89eb","Type":"ContainerStarted","Data":"dd639887a2699feb4cd9f0a01e1d118035aacbb24ceca25dbd97f1a48056f092"} Oct 03 08:53:52 crc kubenswrapper[4765]: I1003 08:53:52.041685 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-9nl8k" podStartSLOduration=3.870717297 podStartE2EDuration="5.041665802s" podCreationTimestamp="2025-10-03 08:53:47 +0000 UTC" firstStartedPulling="2025-10-03 08:53:50.250747231 +0000 UTC m=+874.552241561" lastFinishedPulling="2025-10-03 08:53:51.421695746 +0000 UTC m=+875.723190066" observedRunningTime="2025-10-03 08:53:52.036577143 +0000 UTC m=+876.338071473" watchObservedRunningTime="2025-10-03 08:53:52.041665802 +0000 UTC m=+876.343160132" Oct 03 08:53:55 crc kubenswrapper[4765]: I1003 08:53:55.850020 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-hlltz"] Oct 03 08:53:55 crc kubenswrapper[4765]: I1003 08:53:55.851087 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-hlltz" Oct 03 08:53:55 crc kubenswrapper[4765]: I1003 08:53:55.853010 4765 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-j9kmt" Oct 03 08:53:55 crc kubenswrapper[4765]: I1003 08:53:55.861790 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-hlltz"] Oct 03 08:53:55 crc kubenswrapper[4765]: I1003 08:53:55.894362 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27364373-79dc-418b-8ec0-5d0032a48040-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-hlltz\" (UID: \"27364373-79dc-418b-8ec0-5d0032a48040\") " pod="cert-manager/cert-manager-7d4cc89fcb-hlltz" Oct 03 08:53:55 crc kubenswrapper[4765]: I1003 08:53:55.894416 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24dv\" (UniqueName: \"kubernetes.io/projected/27364373-79dc-418b-8ec0-5d0032a48040-kube-api-access-g24dv\") pod \"cert-manager-7d4cc89fcb-hlltz\" (UID: \"27364373-79dc-418b-8ec0-5d0032a48040\") " pod="cert-manager/cert-manager-7d4cc89fcb-hlltz" Oct 03 08:53:55 crc kubenswrapper[4765]: I1003 08:53:55.995565 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27364373-79dc-418b-8ec0-5d0032a48040-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-hlltz\" (UID: \"27364373-79dc-418b-8ec0-5d0032a48040\") " pod="cert-manager/cert-manager-7d4cc89fcb-hlltz" Oct 03 08:53:55 crc kubenswrapper[4765]: I1003 08:53:55.995688 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g24dv\" (UniqueName: \"kubernetes.io/projected/27364373-79dc-418b-8ec0-5d0032a48040-kube-api-access-g24dv\") pod \"cert-manager-7d4cc89fcb-hlltz\" (UID: \"27364373-79dc-418b-8ec0-5d0032a48040\") " pod="cert-manager/cert-manager-7d4cc89fcb-hlltz" Oct 03 08:53:56 crc kubenswrapper[4765]: I1003 08:53:56.016925 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/27364373-79dc-418b-8ec0-5d0032a48040-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-hlltz\" (UID: \"27364373-79dc-418b-8ec0-5d0032a48040\") " pod="cert-manager/cert-manager-7d4cc89fcb-hlltz" Oct 03 08:53:56 crc kubenswrapper[4765]: I1003 08:53:56.017035 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24dv\" (UniqueName: \"kubernetes.io/projected/27364373-79dc-418b-8ec0-5d0032a48040-kube-api-access-g24dv\") pod \"cert-manager-7d4cc89fcb-hlltz\" (UID: \"27364373-79dc-418b-8ec0-5d0032a48040\") " pod="cert-manager/cert-manager-7d4cc89fcb-hlltz" Oct 03 08:53:56 crc kubenswrapper[4765]: I1003 08:53:56.170753 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-hlltz" Oct 03 08:53:56 crc kubenswrapper[4765]: I1003 08:53:56.455738 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-hlltz"] Oct 03 08:53:56 crc kubenswrapper[4765]: W1003 08:53:56.465871 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27364373_79dc_418b_8ec0_5d0032a48040.slice/crio-332563fbe9a27817977ab05f876590bdf6d0780e4af7dfe049e241df1e1f7fd4 WatchSource:0}: Error finding container 332563fbe9a27817977ab05f876590bdf6d0780e4af7dfe049e241df1e1f7fd4: Status 404 returned error can't find the container with id 332563fbe9a27817977ab05f876590bdf6d0780e4af7dfe049e241df1e1f7fd4 Oct 03 08:53:57 crc kubenswrapper[4765]: I1003 08:53:57.064844 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-hlltz" event={"ID":"27364373-79dc-418b-8ec0-5d0032a48040","Type":"ContainerStarted","Data":"6cf79e84408008dbca8a0343e5b0d5a036ff9a13a91fd3314c8194df87ad8ef0"} Oct 03 08:53:57 crc kubenswrapper[4765]: I1003 08:53:57.065593 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-hlltz" event={"ID":"27364373-79dc-418b-8ec0-5d0032a48040","Type":"ContainerStarted","Data":"332563fbe9a27817977ab05f876590bdf6d0780e4af7dfe049e241df1e1f7fd4"} Oct 03 08:53:57 crc kubenswrapper[4765]: I1003 08:53:57.099732 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-hlltz" podStartSLOduration=2.099707243 podStartE2EDuration="2.099707243s" podCreationTimestamp="2025-10-03 08:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:53:57.097047445 +0000 UTC m=+881.398541775" watchObservedRunningTime="2025-10-03 08:53:57.099707243 +0000 UTC m=+881.401201573" Oct 03 08:53:59 crc kubenswrapper[4765]: I1003 08:53:59.264298 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-tpwbb" Oct 03 08:54:02 crc kubenswrapper[4765]: I1003 08:54:02.579472 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jqgdl"] Oct 03 08:54:02 crc kubenswrapper[4765]: I1003 08:54:02.580768 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jqgdl" Oct 03 08:54:02 crc kubenswrapper[4765]: I1003 08:54:02.582950 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 03 08:54:02 crc kubenswrapper[4765]: I1003 08:54:02.582983 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-q7qjl" Oct 03 08:54:02 crc kubenswrapper[4765]: I1003 08:54:02.583986 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 03 08:54:02 crc kubenswrapper[4765]: I1003 08:54:02.631455 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jqgdl"] Oct 03 08:54:02 crc kubenswrapper[4765]: I1003 08:54:02.688203 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjvsw\" (UniqueName: \"kubernetes.io/projected/e0bff11c-c2bb-4935-bac9-b14465c43101-kube-api-access-qjvsw\") pod \"openstack-operator-index-jqgdl\" (UID: \"e0bff11c-c2bb-4935-bac9-b14465c43101\") " pod="openstack-operators/openstack-operator-index-jqgdl" Oct 03 08:54:02 crc kubenswrapper[4765]: I1003 08:54:02.791945 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjvsw\" (UniqueName: \"kubernetes.io/projected/e0bff11c-c2bb-4935-bac9-b14465c43101-kube-api-access-qjvsw\") pod \"openstack-operator-index-jqgdl\" (UID: \"e0bff11c-c2bb-4935-bac9-b14465c43101\") " pod="openstack-operators/openstack-operator-index-jqgdl" Oct 03 08:54:02 crc kubenswrapper[4765]: I1003 08:54:02.813611 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjvsw\" (UniqueName: \"kubernetes.io/projected/e0bff11c-c2bb-4935-bac9-b14465c43101-kube-api-access-qjvsw\") pod \"openstack-operator-index-jqgdl\" (UID: \"e0bff11c-c2bb-4935-bac9-b14465c43101\") " pod="openstack-operators/openstack-operator-index-jqgdl" Oct 03 08:54:02 crc kubenswrapper[4765]: I1003 08:54:02.900975 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jqgdl" Oct 03 08:54:03 crc kubenswrapper[4765]: I1003 08:54:03.296244 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jqgdl"] Oct 03 08:54:03 crc kubenswrapper[4765]: W1003 08:54:03.304898 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0bff11c_c2bb_4935_bac9_b14465c43101.slice/crio-57ffc494959d423aca2e5ccd0614c8161c95ee54e06275d30d54da4810fbc676 WatchSource:0}: Error finding container 57ffc494959d423aca2e5ccd0614c8161c95ee54e06275d30d54da4810fbc676: Status 404 returned error can't find the container with id 57ffc494959d423aca2e5ccd0614c8161c95ee54e06275d30d54da4810fbc676 Oct 03 08:54:04 crc kubenswrapper[4765]: I1003 08:54:04.108238 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jqgdl" event={"ID":"e0bff11c-c2bb-4935-bac9-b14465c43101","Type":"ContainerStarted","Data":"57ffc494959d423aca2e5ccd0614c8161c95ee54e06275d30d54da4810fbc676"} Oct 03 08:54:06 crc kubenswrapper[4765]: I1003 08:54:06.121490 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jqgdl" event={"ID":"e0bff11c-c2bb-4935-bac9-b14465c43101","Type":"ContainerStarted","Data":"e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630"} Oct 03 08:54:06 crc kubenswrapper[4765]: I1003 08:54:06.139706 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jqgdl" podStartSLOduration=2.144315765 podStartE2EDuration="4.139687593s" podCreationTimestamp="2025-10-03 08:54:02 +0000 UTC" firstStartedPulling="2025-10-03 08:54:03.306847232 +0000 UTC m=+887.608341562" lastFinishedPulling="2025-10-03 08:54:05.30221906 +0000 UTC m=+889.603713390" observedRunningTime="2025-10-03 08:54:06.136842281 +0000 UTC m=+890.438336631" watchObservedRunningTime="2025-10-03 08:54:06.139687593 +0000 UTC m=+890.441181933" Oct 03 08:54:06 crc kubenswrapper[4765]: I1003 08:54:06.759511 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jqgdl"] Oct 03 08:54:07 crc kubenswrapper[4765]: I1003 08:54:07.563821 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-bkz6d"] Oct 03 08:54:07 crc kubenswrapper[4765]: I1003 08:54:07.564735 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-bkz6d" Oct 03 08:54:07 crc kubenswrapper[4765]: I1003 08:54:07.574097 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bkz6d"] Oct 03 08:54:07 crc kubenswrapper[4765]: I1003 08:54:07.669226 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr4jg\" (UniqueName: \"kubernetes.io/projected/731a5006-db70-4aee-8324-98cd6d3b5cf8-kube-api-access-dr4jg\") pod \"openstack-operator-index-bkz6d\" (UID: \"731a5006-db70-4aee-8324-98cd6d3b5cf8\") " pod="openstack-operators/openstack-operator-index-bkz6d" Oct 03 08:54:07 crc kubenswrapper[4765]: I1003 08:54:07.770259 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr4jg\" (UniqueName: \"kubernetes.io/projected/731a5006-db70-4aee-8324-98cd6d3b5cf8-kube-api-access-dr4jg\") pod \"openstack-operator-index-bkz6d\" (UID: \"731a5006-db70-4aee-8324-98cd6d3b5cf8\") " pod="openstack-operators/openstack-operator-index-bkz6d" Oct 03 08:54:07 crc kubenswrapper[4765]: I1003 08:54:07.791002 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr4jg\" (UniqueName: \"kubernetes.io/projected/731a5006-db70-4aee-8324-98cd6d3b5cf8-kube-api-access-dr4jg\") pod \"openstack-operator-index-bkz6d\" (UID: \"731a5006-db70-4aee-8324-98cd6d3b5cf8\") " pod="openstack-operators/openstack-operator-index-bkz6d" Oct 03 08:54:07 crc kubenswrapper[4765]: I1003 08:54:07.884012 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bkz6d" Oct 03 08:54:08 crc kubenswrapper[4765]: I1003 08:54:08.134592 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-jqgdl" podUID="e0bff11c-c2bb-4935-bac9-b14465c43101" containerName="registry-server" containerID="cri-o://e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630" gracePeriod=2 Oct 03 08:54:08 crc kubenswrapper[4765]: I1003 08:54:08.331232 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bkz6d"] Oct 03 08:54:08 crc kubenswrapper[4765]: W1003 08:54:08.344771 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod731a5006_db70_4aee_8324_98cd6d3b5cf8.slice/crio-ab21977878e4ae46443f65aa5462e763eae3afb2242ae806c49b4cad6001cb6a WatchSource:0}: Error finding container ab21977878e4ae46443f65aa5462e763eae3afb2242ae806c49b4cad6001cb6a: Status 404 returned error can't find the container with id ab21977878e4ae46443f65aa5462e763eae3afb2242ae806c49b4cad6001cb6a Oct 03 08:54:08 crc kubenswrapper[4765]: I1003 08:54:08.475109 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jqgdl" Oct 03 08:54:08 crc kubenswrapper[4765]: I1003 08:54:08.480967 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjvsw\" (UniqueName: \"kubernetes.io/projected/e0bff11c-c2bb-4935-bac9-b14465c43101-kube-api-access-qjvsw\") pod \"e0bff11c-c2bb-4935-bac9-b14465c43101\" (UID: \"e0bff11c-c2bb-4935-bac9-b14465c43101\") " Oct 03 08:54:08 crc kubenswrapper[4765]: I1003 08:54:08.488072 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0bff11c-c2bb-4935-bac9-b14465c43101-kube-api-access-qjvsw" (OuterVolumeSpecName: "kube-api-access-qjvsw") pod "e0bff11c-c2bb-4935-bac9-b14465c43101" (UID: "e0bff11c-c2bb-4935-bac9-b14465c43101"). InnerVolumeSpecName "kube-api-access-qjvsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:54:08 crc kubenswrapper[4765]: I1003 08:54:08.583173 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjvsw\" (UniqueName: \"kubernetes.io/projected/e0bff11c-c2bb-4935-bac9-b14465c43101-kube-api-access-qjvsw\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.141884 4765 generic.go:334] "Generic (PLEG): container finished" podID="e0bff11c-c2bb-4935-bac9-b14465c43101" containerID="e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630" exitCode=0 Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.141968 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jqgdl" event={"ID":"e0bff11c-c2bb-4935-bac9-b14465c43101","Type":"ContainerDied","Data":"e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630"} Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.141975 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jqgdl" Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.141994 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jqgdl" event={"ID":"e0bff11c-c2bb-4935-bac9-b14465c43101","Type":"ContainerDied","Data":"57ffc494959d423aca2e5ccd0614c8161c95ee54e06275d30d54da4810fbc676"} Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.142012 4765 scope.go:117] "RemoveContainer" containerID="e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630" Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.143690 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bkz6d" event={"ID":"731a5006-db70-4aee-8324-98cd6d3b5cf8","Type":"ContainerStarted","Data":"603d1598245ba0c8ab123acb42b768a4bbcc6594b8b2bc2be52e2f871ea0d00a"} Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.143751 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bkz6d" event={"ID":"731a5006-db70-4aee-8324-98cd6d3b5cf8","Type":"ContainerStarted","Data":"ab21977878e4ae46443f65aa5462e763eae3afb2242ae806c49b4cad6001cb6a"} Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.160932 4765 scope.go:117] "RemoveContainer" containerID="e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630" Oct 03 08:54:09 crc kubenswrapper[4765]: E1003 08:54:09.161499 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630\": container with ID starting with e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630 not found: ID does not exist" containerID="e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630" Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.161532 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630"} err="failed to get container status \"e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630\": rpc error: code = NotFound desc = could not find container \"e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630\": container with ID starting with e6cbfce45510c5d6e03be427bd843f7a3555839135655bfb043b8001ebcbc630 not found: ID does not exist" Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.168111 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-bkz6d" podStartSLOduration=2.114615843 podStartE2EDuration="2.168076104s" podCreationTimestamp="2025-10-03 08:54:07 +0000 UTC" firstStartedPulling="2025-10-03 08:54:08.349750118 +0000 UTC m=+892.651244448" lastFinishedPulling="2025-10-03 08:54:08.403210379 +0000 UTC m=+892.704704709" observedRunningTime="2025-10-03 08:54:09.161875566 +0000 UTC m=+893.463369906" watchObservedRunningTime="2025-10-03 08:54:09.168076104 +0000 UTC m=+893.469570434" Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.180924 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jqgdl"] Oct 03 08:54:09 crc kubenswrapper[4765]: I1003 08:54:09.185907 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jqgdl"] Oct 03 08:54:10 crc kubenswrapper[4765]: I1003 08:54:10.316497 4765 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e0bff11c-c2bb-4935-bac9-b14465c43101" path="/var/lib/kubelet/pods/e0bff11c-c2bb-4935-bac9-b14465c43101/volumes" Oct 03 08:54:17 crc kubenswrapper[4765]: I1003 08:54:17.884250 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-bkz6d" Oct 03 08:54:17 crc kubenswrapper[4765]: I1003 08:54:17.884796 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-bkz6d" Oct 03 08:54:17 crc kubenswrapper[4765]: I1003 08:54:17.911439 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-bkz6d" Oct 03 08:54:18 crc kubenswrapper[4765]: I1003 08:54:18.229797 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-bkz6d" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.289666 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf"] Oct 03 08:54:24 crc kubenswrapper[4765]: E1003 08:54:24.290325 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0bff11c-c2bb-4935-bac9-b14465c43101" containerName="registry-server" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.290343 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0bff11c-c2bb-4935-bac9-b14465c43101" containerName="registry-server" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.290497 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0bff11c-c2bb-4935-bac9-b14465c43101" containerName="registry-server" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.291816 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.297693 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf"] Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.299183 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-fnnxp" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.403261 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh8mg\" (UniqueName: \"kubernetes.io/projected/81cd31aa-88b3-4604-9e6c-6360936a50ae-kube-api-access-jh8mg\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.403439 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-bundle\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.404230 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-util\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.504902 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-bundle\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.504964 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-util\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.504999 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh8mg\" (UniqueName: \"kubernetes.io/projected/81cd31aa-88b3-4604-9e6c-6360936a50ae-kube-api-access-jh8mg\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.505364 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-bundle\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.506581 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-util\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.521955 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh8mg\" (UniqueName: \"kubernetes.io/projected/81cd31aa-88b3-4604-9e6c-6360936a50ae-kube-api-access-jh8mg\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:24 crc kubenswrapper[4765]: I1003 08:54:24.613714 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:25 crc kubenswrapper[4765]: I1003 08:54:25.038068 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf"] Oct 03 08:54:25 crc kubenswrapper[4765]: I1003 08:54:25.242233 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" event={"ID":"81cd31aa-88b3-4604-9e6c-6360936a50ae","Type":"ContainerStarted","Data":"4452fa0298c6bba5b0ed586ab07165d909109d9646da491b4dae7d6139cc6c96"} Oct 03 08:54:25 crc kubenswrapper[4765]: I1003 08:54:25.242291 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" event={"ID":"81cd31aa-88b3-4604-9e6c-6360936a50ae","Type":"ContainerStarted","Data":"25e7ff1b3621d652173c03ab58a6e3c7d5c6df0d328efaf7d01702f9a3801500"} Oct 03 08:54:26 crc kubenswrapper[4765]: I1003 08:54:26.249894 4765 generic.go:334] "Generic (PLEG): container finished" podID="81cd31aa-88b3-4604-9e6c-6360936a50ae" containerID="4452fa0298c6bba5b0ed586ab07165d909109d9646da491b4dae7d6139cc6c96" exitCode=0 Oct 03 08:54:26 crc kubenswrapper[4765]: I1003 08:54:26.249944 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" event={"ID":"81cd31aa-88b3-4604-9e6c-6360936a50ae","Type":"ContainerDied","Data":"4452fa0298c6bba5b0ed586ab07165d909109d9646da491b4dae7d6139cc6c96"} Oct 03 08:54:27 crc kubenswrapper[4765]: I1003 08:54:27.257839 4765 generic.go:334] "Generic (PLEG): container finished" podID="81cd31aa-88b3-4604-9e6c-6360936a50ae" containerID="52ccf75e4c1b36071f97faee56968ed57a6e6413ead01488d548a3ee7d66cd35" exitCode=0 Oct 03 08:54:27 crc kubenswrapper[4765]: I1003 08:54:27.257929 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" 
event={"ID":"81cd31aa-88b3-4604-9e6c-6360936a50ae","Type":"ContainerDied","Data":"52ccf75e4c1b36071f97faee56968ed57a6e6413ead01488d548a3ee7d66cd35"} Oct 03 08:54:28 crc kubenswrapper[4765]: I1003 08:54:28.266950 4765 generic.go:334] "Generic (PLEG): container finished" podID="81cd31aa-88b3-4604-9e6c-6360936a50ae" containerID="579a1ad99966fa8f355ca4002c99b2a6d8c2263f73a5294bdb49aaa04308a733" exitCode=0 Oct 03 08:54:28 crc kubenswrapper[4765]: I1003 08:54:28.267047 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" event={"ID":"81cd31aa-88b3-4604-9e6c-6360936a50ae","Type":"ContainerDied","Data":"579a1ad99966fa8f355ca4002c99b2a6d8c2263f73a5294bdb49aaa04308a733"} Oct 03 08:54:29 crc kubenswrapper[4765]: I1003 08:54:29.503610 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:29 crc kubenswrapper[4765]: I1003 08:54:29.595849 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh8mg\" (UniqueName: \"kubernetes.io/projected/81cd31aa-88b3-4604-9e6c-6360936a50ae-kube-api-access-jh8mg\") pod \"81cd31aa-88b3-4604-9e6c-6360936a50ae\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " Oct 03 08:54:29 crc kubenswrapper[4765]: I1003 08:54:29.595950 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-util\") pod \"81cd31aa-88b3-4604-9e6c-6360936a50ae\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " Oct 03 08:54:29 crc kubenswrapper[4765]: I1003 08:54:29.596079 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-bundle\") pod \"81cd31aa-88b3-4604-9e6c-6360936a50ae\" (UID: \"81cd31aa-88b3-4604-9e6c-6360936a50ae\") " Oct 03 08:54:29 crc kubenswrapper[4765]: I1003 08:54:29.596969 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-bundle" (OuterVolumeSpecName: "bundle") pod "81cd31aa-88b3-4604-9e6c-6360936a50ae" (UID: "81cd31aa-88b3-4604-9e6c-6360936a50ae"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:54:29 crc kubenswrapper[4765]: I1003 08:54:29.602096 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81cd31aa-88b3-4604-9e6c-6360936a50ae-kube-api-access-jh8mg" (OuterVolumeSpecName: "kube-api-access-jh8mg") pod "81cd31aa-88b3-4604-9e6c-6360936a50ae" (UID: "81cd31aa-88b3-4604-9e6c-6360936a50ae"). InnerVolumeSpecName "kube-api-access-jh8mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:54:29 crc kubenswrapper[4765]: I1003 08:54:29.609634 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-util" (OuterVolumeSpecName: "util") pod "81cd31aa-88b3-4604-9e6c-6360936a50ae" (UID: "81cd31aa-88b3-4604-9e6c-6360936a50ae"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:54:29 crc kubenswrapper[4765]: I1003 08:54:29.697801 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh8mg\" (UniqueName: \"kubernetes.io/projected/81cd31aa-88b3-4604-9e6c-6360936a50ae-kube-api-access-jh8mg\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:29 crc kubenswrapper[4765]: I1003 08:54:29.697850 4765 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:29 crc kubenswrapper[4765]: I1003 08:54:29.697859 4765 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81cd31aa-88b3-4604-9e6c-6360936a50ae-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:30 crc kubenswrapper[4765]: I1003 08:54:30.286005 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" event={"ID":"81cd31aa-88b3-4604-9e6c-6360936a50ae","Type":"ContainerDied","Data":"25e7ff1b3621d652173c03ab58a6e3c7d5c6df0d328efaf7d01702f9a3801500"} Oct 03 08:54:30 crc kubenswrapper[4765]: I1003 08:54:30.286069 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25e7ff1b3621d652173c03ab58a6e3c7d5c6df0d328efaf7d01702f9a3801500" Oct 03 08:54:30 crc kubenswrapper[4765]: I1003 08:54:30.286169 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf" Oct 03 08:54:30 crc kubenswrapper[4765]: I1003 08:54:30.680769 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:54:30 crc kubenswrapper[4765]: I1003 08:54:30.680904 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:54:36 crc kubenswrapper[4765]: I1003 08:54:36.753096 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc"] Oct 03 08:54:36 crc kubenswrapper[4765]: E1003 08:54:36.753973 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81cd31aa-88b3-4604-9e6c-6360936a50ae" containerName="pull" Oct 03 08:54:36 crc kubenswrapper[4765]: I1003 08:54:36.753989 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="81cd31aa-88b3-4604-9e6c-6360936a50ae" containerName="pull" Oct 03 08:54:36 crc kubenswrapper[4765]: E1003 08:54:36.754005 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81cd31aa-88b3-4604-9e6c-6360936a50ae" containerName="extract" Oct 03 08:54:36 crc kubenswrapper[4765]: I1003 08:54:36.754013 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="81cd31aa-88b3-4604-9e6c-6360936a50ae" containerName="extract" Oct 03 08:54:36 crc kubenswrapper[4765]: E1003 08:54:36.754028 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81cd31aa-88b3-4604-9e6c-6360936a50ae" containerName="util" Oct 03 
08:54:36 crc kubenswrapper[4765]: I1003 08:54:36.754035 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="81cd31aa-88b3-4604-9e6c-6360936a50ae" containerName="util" Oct 03 08:54:36 crc kubenswrapper[4765]: I1003 08:54:36.754174 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="81cd31aa-88b3-4604-9e6c-6360936a50ae" containerName="extract" Oct 03 08:54:36 crc kubenswrapper[4765]: I1003 08:54:36.754974 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" Oct 03 08:54:36 crc kubenswrapper[4765]: I1003 08:54:36.756796 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-txcjl" Oct 03 08:54:36 crc kubenswrapper[4765]: I1003 08:54:36.776863 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc"] Oct 03 08:54:36 crc kubenswrapper[4765]: I1003 08:54:36.896736 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxrzw\" (UniqueName: \"kubernetes.io/projected/1aa867b6-fd87-48cf-9461-338e10d56738-kube-api-access-hxrzw\") pod \"openstack-operator-controller-operator-7f8d586cd6-l6vxc\" (UID: \"1aa867b6-fd87-48cf-9461-338e10d56738\") " pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" Oct 03 08:54:36 crc kubenswrapper[4765]: I1003 08:54:36.998007 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxrzw\" (UniqueName: \"kubernetes.io/projected/1aa867b6-fd87-48cf-9461-338e10d56738-kube-api-access-hxrzw\") pod \"openstack-operator-controller-operator-7f8d586cd6-l6vxc\" (UID: \"1aa867b6-fd87-48cf-9461-338e10d56738\") " pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" Oct 03 08:54:37 crc kubenswrapper[4765]: I1003 08:54:37.016577 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxrzw\" (UniqueName: \"kubernetes.io/projected/1aa867b6-fd87-48cf-9461-338e10d56738-kube-api-access-hxrzw\") pod \"openstack-operator-controller-operator-7f8d586cd6-l6vxc\" (UID: \"1aa867b6-fd87-48cf-9461-338e10d56738\") " pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" Oct 03 08:54:37 crc kubenswrapper[4765]: I1003 08:54:37.074254 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" Oct 03 08:54:37 crc kubenswrapper[4765]: I1003 08:54:37.516834 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc"] Oct 03 08:54:38 crc kubenswrapper[4765]: I1003 08:54:38.348832 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" event={"ID":"1aa867b6-fd87-48cf-9461-338e10d56738","Type":"ContainerStarted","Data":"877cb5145318926b815933b0b847d79fec5beb30962ef2ee89701212f6ff5d4d"} Oct 03 08:54:42 crc kubenswrapper[4765]: I1003 08:54:42.374921 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" event={"ID":"1aa867b6-fd87-48cf-9461-338e10d56738","Type":"ContainerStarted","Data":"7307e245178cc09e6b2a102b3c255b2c4113b1bd03d917cb0dcb700e13a7f72e"} Oct 03 08:54:44 crc kubenswrapper[4765]: I1003 08:54:44.388530 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" event={"ID":"1aa867b6-fd87-48cf-9461-338e10d56738","Type":"ContainerStarted","Data":"fb0b3b8b2c36d2ced99c525dfb9626f57a48ff42d673427d6429fb33f9b1f215"} Oct 03 08:54:44 crc kubenswrapper[4765]: I1003 08:54:44.388906 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" Oct 03 08:54:44 crc kubenswrapper[4765]: I1003 08:54:44.414710 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" podStartSLOduration=2.422720551 podStartE2EDuration="8.414691062s" podCreationTimestamp="2025-10-03 08:54:36 +0000 UTC" firstStartedPulling="2025-10-03 08:54:37.526325207 +0000 UTC m=+921.827819537" lastFinishedPulling="2025-10-03 08:54:43.518295718 +0000 UTC m=+927.819790048" observedRunningTime="2025-10-03 08:54:44.411272865 +0000 UTC m=+928.712767195" watchObservedRunningTime="2025-10-03 08:54:44.414691062 +0000 UTC m=+928.716185382" Oct 03 08:54:47 crc kubenswrapper[4765]: I1003 08:54:47.082398 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" Oct 03 08:55:00 crc kubenswrapper[4765]: I1003 08:55:00.680102 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:55:00 crc kubenswrapper[4765]: I1003 08:55:00.680673 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.025994 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.027490 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.030206 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-crsz2" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.045338 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.053355 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.057626 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.061597 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5hbsb" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.077296 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.087563 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.088917 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.093026 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.093339 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-ljmz2" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.111678 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.112972 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.116044 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-q9lkp" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.123768 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.129405 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-6t6dx"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.130593 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-6t6dx" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.137312 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-q9czh" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.155154 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.156378 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.157782 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-75nx8" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.187467 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbcsw\" (UniqueName: \"kubernetes.io/projected/a9c35b93-23e9-43d5-9532-f1f0d51e8ae8-kube-api-access-gbcsw\") pod \"barbican-operator-controller-manager-6c675fb79f-z24cr\" (UID: \"a9c35b93-23e9-43d5-9532-f1f0d51e8ae8\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.187593 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwflp\" (UniqueName: \"kubernetes.io/projected/4fd5d3b8-ca79-48d0-9854-5bb9bc04eae4-kube-api-access-lwflp\") pod \"cinder-operator-controller-manager-79d68d6c85-kxsxl\" (UID: \"4fd5d3b8-ca79-48d0-9854-5bb9bc04eae4\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.197690 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.244859 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.246334 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.252070 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-8s6kg" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.252162 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.264382 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-6t6dx"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.289481 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6n8c\" (UniqueName: \"kubernetes.io/projected/11cef8fb-3e83-485c-8651-6fbf983c682a-kube-api-access-n6n8c\") pod \"designate-operator-controller-manager-75dfd9b554-c9sw7\" (UID: \"11cef8fb-3e83-485c-8651-6fbf983c682a\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.289557 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flff7\" (UniqueName: \"kubernetes.io/projected/24929c35-6b01-4f35-9d3c-7cbd372661a7-kube-api-access-flff7\") pod \"horizon-operator-controller-manager-6769b867d9-tjk7g\" (UID: \"24929c35-6b01-4f35-9d3c-7cbd372661a7\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.289591 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmzp8\" (UniqueName: \"kubernetes.io/projected/341b089a-f32e-4c46-a533-dbdb7f7b836f-kube-api-access-qmzp8\") pod \"glance-operator-controller-manager-846dff85b5-z5mzn\" (UID: \"341b089a-f32e-4c46-a533-dbdb7f7b836f\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.289621 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwflp\" (UniqueName: \"kubernetes.io/projected/4fd5d3b8-ca79-48d0-9854-5bb9bc04eae4-kube-api-access-lwflp\") pod \"cinder-operator-controller-manager-79d68d6c85-kxsxl\" (UID: \"4fd5d3b8-ca79-48d0-9854-5bb9bc04eae4\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.289713 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbcsw\" (UniqueName: \"kubernetes.io/projected/a9c35b93-23e9-43d5-9532-f1f0d51e8ae8-kube-api-access-gbcsw\") pod \"barbican-operator-controller-manager-6c675fb79f-z24cr\" (UID: \"a9c35b93-23e9-43d5-9532-f1f0d51e8ae8\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.289751 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bb2v\" (UniqueName: \"kubernetes.io/projected/c5aa8d54-3d82-404c-a201-d6ca76dc8a8e-kube-api-access-6bb2v\") pod \"heat-operator-controller-manager-599898f689-6t6dx\" (UID: \"c5aa8d54-3d82-404c-a201-d6ca76dc8a8e\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-6t6dx" Oct 03 08:55:04 crc 
kubenswrapper[4765]: I1003 08:55:04.291233 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.292649 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.295091 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7n7wl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.318189 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.320869 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.322238 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.326620 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qdtjn" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.330675 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.331641 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwflp\" (UniqueName: \"kubernetes.io/projected/4fd5d3b8-ca79-48d0-9854-5bb9bc04eae4-kube-api-access-lwflp\") pod \"cinder-operator-controller-manager-79d68d6c85-kxsxl\" (UID: \"4fd5d3b8-ca79-48d0-9854-5bb9bc04eae4\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.336262 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbcsw\" (UniqueName: \"kubernetes.io/projected/a9c35b93-23e9-43d5-9532-f1f0d51e8ae8-kube-api-access-gbcsw\") pod \"barbican-operator-controller-manager-6c675fb79f-z24cr\" (UID: \"a9c35b93-23e9-43d5-9532-f1f0d51e8ae8\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.355806 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.356089 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.367356 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.369098 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.373167 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7z4mq" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.382863 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.391446 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bb2v\" (UniqueName: \"kubernetes.io/projected/c5aa8d54-3d82-404c-a201-d6ca76dc8a8e-kube-api-access-6bb2v\") pod \"heat-operator-controller-manager-599898f689-6t6dx\" (UID: \"c5aa8d54-3d82-404c-a201-d6ca76dc8a8e\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-6t6dx" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.391532 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24d5db65-49a9-45d6-949a-d8310646b691-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-k5nkl\" (UID: \"24d5db65-49a9-45d6-949a-d8310646b691\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.391565 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6n8c\" (UniqueName: \"kubernetes.io/projected/11cef8fb-3e83-485c-8651-6fbf983c682a-kube-api-access-n6n8c\") pod \"designate-operator-controller-manager-75dfd9b554-c9sw7\" (UID: \"11cef8fb-3e83-485c-8651-6fbf983c682a\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.391603 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flff7\" (UniqueName: \"kubernetes.io/projected/24929c35-6b01-4f35-9d3c-7cbd372661a7-kube-api-access-flff7\") pod \"horizon-operator-controller-manager-6769b867d9-tjk7g\" (UID: \"24929c35-6b01-4f35-9d3c-7cbd372661a7\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.391624 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn9n7\" (UniqueName: \"kubernetes.io/projected/991c16cf-3b2a-4f6c-b792-3848bffc434d-kube-api-access-pn9n7\") pod \"ironic-operator-controller-manager-84bc9db6cc-kc4kr\" (UID: \"991c16cf-3b2a-4f6c-b792-3848bffc434d\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.391652 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmzp8\" (UniqueName: \"kubernetes.io/projected/341b089a-f32e-4c46-a533-dbdb7f7b836f-kube-api-access-qmzp8\") pod \"glance-operator-controller-manager-846dff85b5-z5mzn\" (UID: \"341b089a-f32e-4c46-a533-dbdb7f7b836f\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.391692 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqwkk\" (UniqueName: 
\"kubernetes.io/projected/24d5db65-49a9-45d6-949a-d8310646b691-kube-api-access-jqwkk\") pod \"infra-operator-controller-manager-5fbf469cd7-k5nkl\" (UID: \"24d5db65-49a9-45d6-949a-d8310646b691\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.392361 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.425215 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.426285 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.428414 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-mdvgs" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.428617 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmzp8\" (UniqueName: \"kubernetes.io/projected/341b089a-f32e-4c46-a533-dbdb7f7b836f-kube-api-access-qmzp8\") pod \"glance-operator-controller-manager-846dff85b5-z5mzn\" (UID: \"341b089a-f32e-4c46-a533-dbdb7f7b836f\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.436151 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6n8c\" (UniqueName: \"kubernetes.io/projected/11cef8fb-3e83-485c-8651-6fbf983c682a-kube-api-access-n6n8c\") pod \"designate-operator-controller-manager-75dfd9b554-c9sw7\" (UID: \"11cef8fb-3e83-485c-8651-6fbf983c682a\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.438576 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.439795 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.446845 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.448128 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-b8qhm" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.450454 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flff7\" (UniqueName: \"kubernetes.io/projected/24929c35-6b01-4f35-9d3c-7cbd372661a7-kube-api-access-flff7\") pod \"horizon-operator-controller-manager-6769b867d9-tjk7g\" (UID: \"24929c35-6b01-4f35-9d3c-7cbd372661a7\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.455193 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bb2v\" (UniqueName: \"kubernetes.io/projected/c5aa8d54-3d82-404c-a201-d6ca76dc8a8e-kube-api-access-6bb2v\") pod \"heat-operator-controller-manager-599898f689-6t6dx\" (UID: \"c5aa8d54-3d82-404c-a201-d6ca76dc8a8e\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-6t6dx" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.470795 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.471482 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-6t6dx" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.486815 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.488375 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.493372 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-x5r5p" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.494003 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24d5db65-49a9-45d6-949a-d8310646b691-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-k5nkl\" (UID: \"24d5db65-49a9-45d6-949a-d8310646b691\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.494078 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5clr\" (UniqueName: \"kubernetes.io/projected/65b40fd9-a427-4ab2-ab36-e4422081aa43-kube-api-access-f5clr\") pod \"manila-operator-controller-manager-6fd6854b49-rg2n6\" (UID: \"65b40fd9-a427-4ab2-ab36-e4422081aa43\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.494124 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn9n7\" (UniqueName: \"kubernetes.io/projected/991c16cf-3b2a-4f6c-b792-3848bffc434d-kube-api-access-pn9n7\") pod \"ironic-operator-controller-manager-84bc9db6cc-kc4kr\" (UID: \"991c16cf-3b2a-4f6c-b792-3848bffc434d\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.494169 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqwkk\" (UniqueName: \"kubernetes.io/projected/24d5db65-49a9-45d6-949a-d8310646b691-kube-api-access-jqwkk\") pod \"infra-operator-controller-manager-5fbf469cd7-k5nkl\" (UID: \"24d5db65-49a9-45d6-949a-d8310646b691\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.498426 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.501750 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c97gn\" (UniqueName: \"kubernetes.io/projected/b609ed75-6fd6-4719-a9dd-eca3c0f7df03-kube-api-access-c97gn\") pod \"keystone-operator-controller-manager-7f55849f88-88qh4\" (UID: \"b609ed75-6fd6-4719-a9dd-eca3c0f7df03\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.505543 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24d5db65-49a9-45d6-949a-d8310646b691-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-k5nkl\" (UID: \"24d5db65-49a9-45d6-949a-d8310646b691\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.515371 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.537448 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqwkk\" (UniqueName: \"kubernetes.io/projected/24d5db65-49a9-45d6-949a-d8310646b691-kube-api-access-jqwkk\") pod \"infra-operator-controller-manager-5fbf469cd7-k5nkl\" (UID: \"24d5db65-49a9-45d6-949a-d8310646b691\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.544906 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn9n7\" (UniqueName: \"kubernetes.io/projected/991c16cf-3b2a-4f6c-b792-3848bffc434d-kube-api-access-pn9n7\") pod \"ironic-operator-controller-manager-84bc9db6cc-kc4kr\" (UID: \"991c16cf-3b2a-4f6c-b792-3848bffc434d\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.545986 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.563964 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.565256 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.566290 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.570112 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-j5khl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.577047 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.599316 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.600990 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.605071 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5clr\" (UniqueName: \"kubernetes.io/projected/65b40fd9-a427-4ab2-ab36-e4422081aa43-kube-api-access-f5clr\") pod \"manila-operator-controller-manager-6fd6854b49-rg2n6\" (UID: \"65b40fd9-a427-4ab2-ab36-e4422081aa43\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.612942 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nt2l\" (UniqueName: \"kubernetes.io/projected/2cda6800-dddc-4535-86bc-25d3f2d167b0-kube-api-access-5nt2l\") pod \"mariadb-operator-controller-manager-5c468bf4d4-8tlhl\" (UID: \"2cda6800-dddc-4535-86bc-25d3f2d167b0\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.613094 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c97gn\" (UniqueName: \"kubernetes.io/projected/b609ed75-6fd6-4719-a9dd-eca3c0f7df03-kube-api-access-c97gn\") pod \"keystone-operator-controller-manager-7f55849f88-88qh4\" (UID: \"b609ed75-6fd6-4719-a9dd-eca3c0f7df03\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.613218 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7frgx\" (UniqueName: \"kubernetes.io/projected/21090664-3a4c-4c58-852d-d2779c7bf17d-kube-api-access-7frgx\") pod \"neutron-operator-controller-manager-6574bf987d-867c4\" (UID: \"21090664-3a4c-4c58-852d-d2779c7bf17d\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.613241 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59nqc\" (UniqueName: \"kubernetes.io/projected/c6bb21ee-995b-448f-8356-bd14d768d848-kube-api-access-59nqc\") pod \"nova-operator-controller-manager-555c7456bd-7zq5j\" (UID: \"c6bb21ee-995b-448f-8356-bd14d768d848\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.618103 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.626765 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.635166 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-ww7jx" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.635433 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.650205 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.651454 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.654308 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-sv75c" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.658212 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c97gn\" (UniqueName: \"kubernetes.io/projected/b609ed75-6fd6-4719-a9dd-eca3c0f7df03-kube-api-access-c97gn\") pod \"keystone-operator-controller-manager-7f55849f88-88qh4\" (UID: \"b609ed75-6fd6-4719-a9dd-eca3c0f7df03\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.662586 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5clr\" (UniqueName: \"kubernetes.io/projected/65b40fd9-a427-4ab2-ab36-e4422081aa43-kube-api-access-f5clr\") pod \"manila-operator-controller-manager-6fd6854b49-rg2n6\" (UID: \"65b40fd9-a427-4ab2-ab36-e4422081aa43\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.669917 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.687083 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.688190 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.691039 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-ndxb6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.697134 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.706460 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.710607 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.711893 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.720464 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqd9x\" (UniqueName: \"kubernetes.io/projected/b52d06aa-3131-4887-b541-16ada075b2dd-kube-api-access-xqd9x\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6\" (UID: \"b52d06aa-3131-4887-b541-16ada075b2dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.720510 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7frgx\" (UniqueName: \"kubernetes.io/projected/21090664-3a4c-4c58-852d-d2779c7bf17d-kube-api-access-7frgx\") pod \"neutron-operator-controller-manager-6574bf987d-867c4\" (UID: \"21090664-3a4c-4c58-852d-d2779c7bf17d\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.720532 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59nqc\" (UniqueName: \"kubernetes.io/projected/c6bb21ee-995b-448f-8356-bd14d768d848-kube-api-access-59nqc\") pod \"nova-operator-controller-manager-555c7456bd-7zq5j\" (UID: \"c6bb21ee-995b-448f-8356-bd14d768d848\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.720598 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6br\" (UniqueName: \"kubernetes.io/projected/a8759262-a00d-4609-9ca0-7bb0701064f0-kube-api-access-qt6br\") pod \"octavia-operator-controller-manager-59d6cfdf45-zmn6f\" (UID: \"a8759262-a00d-4609-9ca0-7bb0701064f0\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.720619 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nt2l\" (UniqueName: \"kubernetes.io/projected/2cda6800-dddc-4535-86bc-25d3f2d167b0-kube-api-access-5nt2l\") pod \"mariadb-operator-controller-manager-5c468bf4d4-8tlhl\" (UID: \"2cda6800-dddc-4535-86bc-25d3f2d167b0\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 
08:55:04.720639 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6\" (UID: \"b52d06aa-3131-4887-b541-16ada075b2dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.725578 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-pv2dw" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.739386 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.740798 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.769982 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.775753 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-74r9d" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.794358 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.795811 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59nqc\" (UniqueName: \"kubernetes.io/projected/c6bb21ee-995b-448f-8356-bd14d768d848-kube-api-access-59nqc\") pod \"nova-operator-controller-manager-555c7456bd-7zq5j\" (UID: \"c6bb21ee-995b-448f-8356-bd14d768d848\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.825001 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.832430 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.843273 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqd9x\" (UniqueName: \"kubernetes.io/projected/b52d06aa-3131-4887-b541-16ada075b2dd-kube-api-access-xqd9x\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6\" (UID: \"b52d06aa-3131-4887-b541-16ada075b2dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.843339 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dznf\" (UniqueName: \"kubernetes.io/projected/b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7-kube-api-access-7dznf\") pod \"telemetry-operator-controller-manager-5db5cf686f-qjp99\" (UID: \"b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 
08:55:04.843474 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67wj4\" (UniqueName: \"kubernetes.io/projected/98995a33-7241-42fe-a4c8-308da084b8db-kube-api-access-67wj4\") pod \"placement-operator-controller-manager-7d8bb7f44c-gsgcs\" (UID: \"98995a33-7241-42fe-a4c8-308da084b8db\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.843625 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6br\" (UniqueName: \"kubernetes.io/projected/a8759262-a00d-4609-9ca0-7bb0701064f0-kube-api-access-qt6br\") pod \"octavia-operator-controller-manager-59d6cfdf45-zmn6f\" (UID: \"a8759262-a00d-4609-9ca0-7bb0701064f0\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.843731 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kfv6\" (UniqueName: \"kubernetes.io/projected/4a654679-7956-4885-841e-e1718fc57bef-kube-api-access-7kfv6\") pod \"swift-operator-controller-manager-6859f9b676-5rxfw\" (UID: \"4a654679-7956-4885-841e-e1718fc57bef\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.843760 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6\" (UID: \"b52d06aa-3131-4887-b541-16ada075b2dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.843798 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzdg9\" (UniqueName: \"kubernetes.io/projected/a6dc4446-834f-45ac-8504-1ffc7be80df3-kube-api-access-vzdg9\") pod \"ovn-operator-controller-manager-688db7b6c7-qmgwv\" (UID: \"a6dc4446-834f-45ac-8504-1ffc7be80df3\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv" Oct 03 08:55:04 crc kubenswrapper[4765]: E1003 08:55:04.844643 4765 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 08:55:04 crc kubenswrapper[4765]: E1003 08:55:04.844749 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert podName:b52d06aa-3131-4887-b541-16ada075b2dd nodeName:}" failed. No retries permitted until 2025-10-03 08:55:05.344721542 +0000 UTC m=+949.646215882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert") pod "openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" (UID: "b52d06aa-3131-4887-b541-16ada075b2dd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.848246 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.853426 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7frgx\" (UniqueName: \"kubernetes.io/projected/21090664-3a4c-4c58-852d-d2779c7bf17d-kube-api-access-7frgx\") pod \"neutron-operator-controller-manager-6574bf987d-867c4\" (UID: \"21090664-3a4c-4c58-852d-d2779c7bf17d\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.853969 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nt2l\" (UniqueName: \"kubernetes.io/projected/2cda6800-dddc-4535-86bc-25d3f2d167b0-kube-api-access-5nt2l\") pod \"mariadb-operator-controller-manager-5c468bf4d4-8tlhl\" (UID: \"2cda6800-dddc-4535-86bc-25d3f2d167b0\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.877511 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.879470 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.885248 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-fdl9d" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.890863 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.899831 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqd9x\" (UniqueName: \"kubernetes.io/projected/b52d06aa-3131-4887-b541-16ada075b2dd-kube-api-access-xqd9x\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6\" (UID: \"b52d06aa-3131-4887-b541-16ada075b2dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.900818 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6br\" (UniqueName: \"kubernetes.io/projected/a8759262-a00d-4609-9ca0-7bb0701064f0-kube-api-access-qt6br\") pod \"octavia-operator-controller-manager-59d6cfdf45-zmn6f\" (UID: \"a8759262-a00d-4609-9ca0-7bb0701064f0\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.904945 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.907490 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.910341 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5bqtp" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.913207 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.924969 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.941599 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.943388 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.946474 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kfv6\" (UniqueName: \"kubernetes.io/projected/4a654679-7956-4885-841e-e1718fc57bef-kube-api-access-7kfv6\") pod \"swift-operator-controller-manager-6859f9b676-5rxfw\" (UID: \"4a654679-7956-4885-841e-e1718fc57bef\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.946532 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzdg9\" (UniqueName: \"kubernetes.io/projected/a6dc4446-834f-45ac-8504-1ffc7be80df3-kube-api-access-vzdg9\") pod \"ovn-operator-controller-manager-688db7b6c7-qmgwv\" (UID: \"a6dc4446-834f-45ac-8504-1ffc7be80df3\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.946600 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dznf\" (UniqueName: \"kubernetes.io/projected/b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7-kube-api-access-7dznf\") pod \"telemetry-operator-controller-manager-5db5cf686f-qjp99\" (UID: \"b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.946651 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67wj4\" (UniqueName: \"kubernetes.io/projected/98995a33-7241-42fe-a4c8-308da084b8db-kube-api-access-67wj4\") pod \"placement-operator-controller-manager-7d8bb7f44c-gsgcs\" (UID: \"98995a33-7241-42fe-a4c8-308da084b8db\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.947399 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.948883 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.948890 4765 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qcqpj" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.968091 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.969249 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.969388 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kfv6\" (UniqueName: \"kubernetes.io/projected/4a654679-7956-4885-841e-e1718fc57bef-kube-api-access-7kfv6\") pod \"swift-operator-controller-manager-6859f9b676-5rxfw\" (UID: \"4a654679-7956-4885-841e-e1718fc57bef\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.971510 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-4lzc5" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.974501 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzdg9\" (UniqueName: \"kubernetes.io/projected/a6dc4446-834f-45ac-8504-1ffc7be80df3-kube-api-access-vzdg9\") pod \"ovn-operator-controller-manager-688db7b6c7-qmgwv\" (UID: \"a6dc4446-834f-45ac-8504-1ffc7be80df3\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.976951 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.991357 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dznf\" (UniqueName: \"kubernetes.io/projected/b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7-kube-api-access-7dznf\") pod \"telemetry-operator-controller-manager-5db5cf686f-qjp99\" (UID: \"b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.993627 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr"] Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.996004 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67wj4\" (UniqueName: \"kubernetes.io/projected/98995a33-7241-42fe-a4c8-308da084b8db-kube-api-access-67wj4\") pod \"placement-operator-controller-manager-7d8bb7f44c-gsgcs\" (UID: \"98995a33-7241-42fe-a4c8-308da084b8db\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs" Oct 03 08:55:04 crc kubenswrapper[4765]: I1003 08:55:04.997348 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.031475 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.061944 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8075708f-0d72-4b98-b39e-a0ee77011c10-cert\") pod \"openstack-operator-controller-manager-6fb94b767d-jsrj4\" (UID: \"8075708f-0d72-4b98-b39e-a0ee77011c10\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.061985 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7jbh\" (UniqueName: \"kubernetes.io/projected/8075708f-0d72-4b98-b39e-a0ee77011c10-kube-api-access-g7jbh\") pod \"openstack-operator-controller-manager-6fb94b767d-jsrj4\" (UID: \"8075708f-0d72-4b98-b39e-a0ee77011c10\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.062049 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgqk7\" (UniqueName: \"kubernetes.io/projected/f3a40e37-8073-4528-9379-9cea11f883ed-kube-api-access-bgqk7\") pod \"test-operator-controller-manager-5cd5cb47d7-vvxdl\" (UID: \"f3a40e37-8073-4528-9379-9cea11f883ed\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.062429 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcthx\" (UniqueName: \"kubernetes.io/projected/2f15ff13-62f4-43da-82f8-5d5faac4f502-kube-api-access-kcthx\") pod \"watcher-operator-controller-manager-5c899c45b6-mkt28\" (UID: \"2f15ff13-62f4-43da-82f8-5d5faac4f502\") " pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.097201 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.102888 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.111123 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl"] Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.113317 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.123357 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.164175 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8075708f-0d72-4b98-b39e-a0ee77011c10-cert\") pod \"openstack-operator-controller-manager-6fb94b767d-jsrj4\" (UID: \"8075708f-0d72-4b98-b39e-a0ee77011c10\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.164205 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7jbh\" (UniqueName: \"kubernetes.io/projected/8075708f-0d72-4b98-b39e-a0ee77011c10-kube-api-access-g7jbh\") pod \"openstack-operator-controller-manager-6fb94b767d-jsrj4\" (UID: \"8075708f-0d72-4b98-b39e-a0ee77011c10\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.164260 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgqk7\" (UniqueName: \"kubernetes.io/projected/f3a40e37-8073-4528-9379-9cea11f883ed-kube-api-access-bgqk7\") pod \"test-operator-controller-manager-5cd5cb47d7-vvxdl\" (UID: \"f3a40e37-8073-4528-9379-9cea11f883ed\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.164305 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdl48\" (UniqueName: \"kubernetes.io/projected/ac8027c9-646b-4c9f-965e-639d6ad818dd-kube-api-access-bdl48\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf\" (UID: \"ac8027c9-646b-4c9f-965e-639d6ad818dd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.164330 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcthx\" (UniqueName: \"kubernetes.io/projected/2f15ff13-62f4-43da-82f8-5d5faac4f502-kube-api-access-kcthx\") pod \"watcher-operator-controller-manager-5c899c45b6-mkt28\" (UID: \"2f15ff13-62f4-43da-82f8-5d5faac4f502\") " pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" Oct 03 08:55:05 crc kubenswrapper[4765]: E1003 08:55:05.164687 4765 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 08:55:05 crc kubenswrapper[4765]: E1003 08:55:05.164725 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8075708f-0d72-4b98-b39e-a0ee77011c10-cert podName:8075708f-0d72-4b98-b39e-a0ee77011c10 nodeName:}" failed. No retries permitted until 2025-10-03 08:55:05.66471173 +0000 UTC m=+949.966206060 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8075708f-0d72-4b98-b39e-a0ee77011c10-cert") pod "openstack-operator-controller-manager-6fb94b767d-jsrj4" (UID: "8075708f-0d72-4b98-b39e-a0ee77011c10") : secret "webhook-server-cert" not found Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.188768 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcthx\" (UniqueName: \"kubernetes.io/projected/2f15ff13-62f4-43da-82f8-5d5faac4f502-kube-api-access-kcthx\") pod \"watcher-operator-controller-manager-5c899c45b6-mkt28\" (UID: \"2f15ff13-62f4-43da-82f8-5d5faac4f502\") " pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.189265 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7jbh\" (UniqueName: \"kubernetes.io/projected/8075708f-0d72-4b98-b39e-a0ee77011c10-kube-api-access-g7jbh\") pod \"openstack-operator-controller-manager-6fb94b767d-jsrj4\" (UID: \"8075708f-0d72-4b98-b39e-a0ee77011c10\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.207906 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgqk7\" (UniqueName: \"kubernetes.io/projected/f3a40e37-8073-4528-9379-9cea11f883ed-kube-api-access-bgqk7\") pod \"test-operator-controller-manager-5cd5cb47d7-vvxdl\" (UID: \"f3a40e37-8073-4528-9379-9cea11f883ed\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.266107 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdl48\" (UniqueName: \"kubernetes.io/projected/ac8027c9-646b-4c9f-965e-639d6ad818dd-kube-api-access-bdl48\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf\" (UID: \"ac8027c9-646b-4c9f-965e-639d6ad818dd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.292629 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdl48\" (UniqueName: \"kubernetes.io/projected/ac8027c9-646b-4c9f-965e-639d6ad818dd-kube-api-access-bdl48\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf\" (UID: \"ac8027c9-646b-4c9f-965e-639d6ad818dd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.312342 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.367748 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6\" (UID: \"b52d06aa-3131-4887-b541-16ada075b2dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:05 crc kubenswrapper[4765]: E1003 08:55:05.368051 4765 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 08:55:05 crc kubenswrapper[4765]: E1003 08:55:05.368179 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert podName:b52d06aa-3131-4887-b541-16ada075b2dd nodeName:}" failed. No retries permitted until 2025-10-03 08:55:06.36815176 +0000 UTC m=+950.669646090 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert") pod "openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" (UID: "b52d06aa-3131-4887-b541-16ada075b2dd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.398695 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.440621 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf" Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.571936 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl" event={"ID":"4fd5d3b8-ca79-48d0-9854-5bb9bc04eae4","Type":"ContainerStarted","Data":"8089887812ab86d5277a22400c5cd0c97f053152b972944cb4a31c3143a83fd9"} Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.573169 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr" event={"ID":"a9c35b93-23e9-43d5-9532-f1f0d51e8ae8","Type":"ContainerStarted","Data":"f64e6eb0c1d77d569eba50f24d48f45f2736e87ec4c0d70dfe081dc3084d0b17"} Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.674440 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8075708f-0d72-4b98-b39e-a0ee77011c10-cert\") pod \"openstack-operator-controller-manager-6fb94b767d-jsrj4\" (UID: \"8075708f-0d72-4b98-b39e-a0ee77011c10\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:05 crc kubenswrapper[4765]: E1003 08:55:05.674706 4765 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 08:55:05 crc kubenswrapper[4765]: E1003 08:55:05.674756 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8075708f-0d72-4b98-b39e-a0ee77011c10-cert podName:8075708f-0d72-4b98-b39e-a0ee77011c10 nodeName:}" failed. 
No retries permitted until 2025-10-03 08:55:06.674741327 +0000 UTC m=+950.976235667 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8075708f-0d72-4b98-b39e-a0ee77011c10-cert") pod "openstack-operator-controller-manager-6fb94b767d-jsrj4" (UID: "8075708f-0d72-4b98-b39e-a0ee77011c10") : secret "webhook-server-cert" not found Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.729637 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn"] Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.738276 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g"] Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.762506 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-6t6dx"] Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.770289 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr"] Oct 03 08:55:05 crc kubenswrapper[4765]: I1003 08:55:05.777821 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl"] Oct 03 08:55:05 crc kubenswrapper[4765]: W1003 08:55:05.809127 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod341b089a_f32e_4c46_a533_dbdb7f7b836f.slice/crio-0e9bdf926e7ac19c786e571bced0801879aeb70087a42ec27ba11eefb7b7a668 WatchSource:0}: Error finding container 0e9bdf926e7ac19c786e571bced0801879aeb70087a42ec27ba11eefb7b7a668: Status 404 returned error can't find the container with id 0e9bdf926e7ac19c786e571bced0801879aeb70087a42ec27ba11eefb7b7a668 Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.146206 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f"] Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.165602 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7"] Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.219269 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j"] Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.259259 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6"] Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.269315 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv"] Oct 03 08:55:06 crc kubenswrapper[4765]: W1003 08:55:06.273652 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb609ed75_6fd6_4719_a9dd_eca3c0f7df03.slice/crio-0a4e808fc466223f4f1ec284e68861998eca88abfcac55ae588e96475003b650 WatchSource:0}: Error finding container 0a4e808fc466223f4f1ec284e68861998eca88abfcac55ae588e96475003b650: Status 404 returned error can't find the container with id 0a4e808fc466223f4f1ec284e68861998eca88abfcac55ae588e96475003b650 Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.281517 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4"] Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.283425 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw"] Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.293045 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf"] Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.297079 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4"] Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.299590 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:570e59f91d7dd66c9abcec1e54889a44c65d676d3fff6802be101fe5215bc988,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7frgx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6574bf987d-867c4_openstack-operators(21090664-3a4c-4c58-852d-d2779c7bf17d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.305268 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl"] Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.320283 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5nt2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-5c468bf4d4-8tlhl_openstack-operators(2cda6800-dddc-4535-86bc-25d3f2d167b0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:06 crc kubenswrapper[4765]: W1003 08:55:06.334238 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac8027c9_646b_4c9f_965e_639d6ad818dd.slice/crio-b5de8a40716fa1919603955325482d9029c52612d3b4f3c14d62dc80e3608331 WatchSource:0}: Error finding container b5de8a40716fa1919603955325482d9029c52612d3b4f3c14d62dc80e3608331: Status 404 returned error can't find the container with id b5de8a40716fa1919603955325482d9029c52612d3b4f3c14d62dc80e3608331 Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.341425 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bdl48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf_openstack-operators(ac8027c9-646b-4c9f-965e-639d6ad818dd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.342543 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf" podUID="ac8027c9-646b-4c9f-965e-639d6ad818dd" Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.395665 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6\" (UID: \"b52d06aa-3131-4887-b541-16ada075b2dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.396636 4765 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.396714 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert podName:b52d06aa-3131-4887-b541-16ada075b2dd nodeName:}" failed. No retries permitted until 2025-10-03 08:55:08.396695009 +0000 UTC m=+952.698189439 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert") pod "openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" (UID: "b52d06aa-3131-4887-b541-16ada075b2dd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.455644 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28"] Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.461828 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs"] Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.468453 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99"] Oct 03 08:55:06 crc kubenswrapper[4765]: W1003 08:55:06.475799 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98995a33_7241_42fe_a4c8_308da084b8db.slice/crio-ea11ed89691ad221c1d47735add392c63319eaf6e7d0c2964490dc4a15d30b12 WatchSource:0}: Error finding container ea11ed89691ad221c1d47735add392c63319eaf6e7d0c2964490dc4a15d30b12: Status 404 returned error can't find the container with id ea11ed89691ad221c1d47735add392c63319eaf6e7d0c2964490dc4a15d30b12 Oct 03 08:55:06 crc kubenswrapper[4765]: W1003 08:55:06.478715 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0f3d3a4_5e2c_4f86_a6ab_65be50620ce7.slice/crio-b2f73a671a8897388fad4f686547b4969189b7d7c00726832af17d736de6c7a3 WatchSource:0}: Error finding container b2f73a671a8897388fad4f686547b4969189b7d7c00726832af17d736de6c7a3: Status 404 returned error can't find the container with id b2f73a671a8897388fad4f686547b4969189b7d7c00726832af17d736de6c7a3 Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.489508 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7dznf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5db5cf686f-qjp99_openstack-operators(b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.490975 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl"] Oct 03 08:55:06 crc kubenswrapper[4765]: W1003 08:55:06.492358 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3a40e37_8073_4528_9379_9cea11f883ed.slice/crio-5706a5250fc5f2ba60985d455e0b7b6b1cad007810d35c194101b03942807cc6 WatchSource:0}: Error finding container 5706a5250fc5f2ba60985d455e0b7b6b1cad007810d35c194101b03942807cc6: Status 404 returned error can't find the container with id 5706a5250fc5f2ba60985d455e0b7b6b1cad007810d35c194101b03942807cc6 Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.493612 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.58:5001/openstack-k8s-operators/watcher-operator:45d3689e3784ce2fe5ff22a7c8f8533d389c1a20,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kcthx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5c899c45b6-mkt28_openstack-operators(2f15ff13-62f4-43da-82f8-5d5faac4f502): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.498255 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bgqk7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-vvxdl_openstack-operators(f3a40e37-8073-4528-9379-9cea11f883ed): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.583063 4765 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn" event={"ID":"341b089a-f32e-4c46-a533-dbdb7f7b836f","Type":"ContainerStarted","Data":"0e9bdf926e7ac19c786e571bced0801879aeb70087a42ec27ba11eefb7b7a668"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.588421 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf" event={"ID":"ac8027c9-646b-4c9f-965e-639d6ad818dd","Type":"ContainerStarted","Data":"b5de8a40716fa1919603955325482d9029c52612d3b4f3c14d62dc80e3608331"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.593613 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-6t6dx" event={"ID":"c5aa8d54-3d82-404c-a201-d6ca76dc8a8e","Type":"ContainerStarted","Data":"b68ce862bfa6c86a90b37a7c220221eeecdee01439034549f1abe08e89862588"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.595199 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" event={"ID":"24d5db65-49a9-45d6-949a-d8310646b691","Type":"ContainerStarted","Data":"427ba136295c5097e491b7cddaf09dd8e5cd2e9beee66bdf1f78771bd8850744"} Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.596369 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf" podUID="ac8027c9-646b-4c9f-965e-639d6ad818dd" Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.600991 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7" event={"ID":"11cef8fb-3e83-485c-8651-6fbf983c682a","Type":"ContainerStarted","Data":"22eace49308008f273b69cab6eee72168723cfa99f32daa6bbcab1cc50b1119a"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.603242 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs" event={"ID":"98995a33-7241-42fe-a4c8-308da084b8db","Type":"ContainerStarted","Data":"ea11ed89691ad221c1d47735add392c63319eaf6e7d0c2964490dc4a15d30b12"} Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.604477 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" podUID="2cda6800-dddc-4535-86bc-25d3f2d167b0" Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.604800 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j" event={"ID":"c6bb21ee-995b-448f-8356-bd14d768d848","Type":"ContainerStarted","Data":"588bab4512517c3382b9b5a6bdc30a0f2287f018d6a5a95e3d50262a4bde56ba"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.606623 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv" event={"ID":"a6dc4446-834f-45ac-8504-1ffc7be80df3","Type":"ContainerStarted","Data":"78884967f0f3d28adbaad310b2d9ae0ed2345bf2d9a8f4a79fcd7e3b212335a9"} Oct 03 08:55:06 crc kubenswrapper[4765]: 
I1003 08:55:06.608755 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" event={"ID":"b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7","Type":"ContainerStarted","Data":"b2f73a671a8897388fad4f686547b4969189b7d7c00726832af17d736de6c7a3"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.611122 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" event={"ID":"f3a40e37-8073-4528-9379-9cea11f883ed","Type":"ContainerStarted","Data":"5706a5250fc5f2ba60985d455e0b7b6b1cad007810d35c194101b03942807cc6"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.613685 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6" event={"ID":"65b40fd9-a427-4ab2-ab36-e4422081aa43","Type":"ContainerStarted","Data":"f54815cb51e0ca853146253c79b694c89448ce620185529b24f0c79bc739f5e8"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.616392 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4" event={"ID":"b609ed75-6fd6-4719-a9dd-eca3c0f7df03","Type":"ContainerStarted","Data":"0a4e808fc466223f4f1ec284e68861998eca88abfcac55ae588e96475003b650"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.618672 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g" event={"ID":"24929c35-6b01-4f35-9d3c-7cbd372661a7","Type":"ContainerStarted","Data":"90b50480180c2b70fd887f8ee71091281fef3617df9921e0ee571d3798a22f4a"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.625898 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" event={"ID":"21090664-3a4c-4c58-852d-d2779c7bf17d","Type":"ContainerStarted","Data":"5f96ba4499a66a8bf4d2ce36140a4667fef64db9d30970126c8b98dafab1e093"} Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.626563 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" podUID="21090664-3a4c-4c58-852d-d2779c7bf17d" Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.632085 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f" event={"ID":"a8759262-a00d-4609-9ca0-7bb0701064f0","Type":"ContainerStarted","Data":"ca0c30809618c7317e089b303759ba320863ecee06091b025b0a5180be68f41f"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.636593 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" event={"ID":"2cda6800-dddc-4535-86bc-25d3f2d167b0","Type":"ContainerStarted","Data":"057f61ac136485531f92d2243b6381970631153191fee18e6b3f8db88cd2b531"} Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.638022 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" 
podUID="2cda6800-dddc-4535-86bc-25d3f2d167b0" Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.640202 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" event={"ID":"2f15ff13-62f4-43da-82f8-5d5faac4f502","Type":"ContainerStarted","Data":"b93e242d2d9c5790d8c5088e298958ba63ef77082fe06dd007f9782eaf76b573"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.643362 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr" event={"ID":"991c16cf-3b2a-4f6c-b792-3848bffc434d","Type":"ContainerStarted","Data":"e855ddb2de7d7081e715ae7d0e5579760b2ef3a6044803062a251f3eee1a7633"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.645556 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw" event={"ID":"4a654679-7956-4885-841e-e1718fc57bef","Type":"ContainerStarted","Data":"d15074b2f480b6a1e72fe7259de4cfc8d60c6415284b018cc9ee1c1134791b8b"} Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.699549 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8075708f-0d72-4b98-b39e-a0ee77011c10-cert\") pod \"openstack-operator-controller-manager-6fb94b767d-jsrj4\" (UID: \"8075708f-0d72-4b98-b39e-a0ee77011c10\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.713485 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" podUID="b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7" Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.733490 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8075708f-0d72-4b98-b39e-a0ee77011c10-cert\") pod \"openstack-operator-controller-manager-6fb94b767d-jsrj4\" (UID: \"8075708f-0d72-4b98-b39e-a0ee77011c10\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.750968 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" Oct 03 08:55:06 crc kubenswrapper[4765]: E1003 08:55:06.767785 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" podUID="f3a40e37-8073-4528-9379-9cea11f883ed" Oct 03 08:55:06 crc kubenswrapper[4765]: I1003 08:55:06.930880 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:07 crc kubenswrapper[4765]: I1003 08:55:07.687355 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" event={"ID":"f3a40e37-8073-4528-9379-9cea11f883ed","Type":"ContainerStarted","Data":"c6a7f094b370e498d74e47ce1a5c580cf1ae4270f8aa7e9a8d0f1205af363eac"} Oct 03 08:55:07 crc kubenswrapper[4765]: E1003 08:55:07.691335 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" podUID="f3a40e37-8073-4528-9379-9cea11f883ed" Oct 03 08:55:07 crc kubenswrapper[4765]: I1003 08:55:07.705054 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" event={"ID":"21090664-3a4c-4c58-852d-d2779c7bf17d","Type":"ContainerStarted","Data":"8d12bf82ca4557491d810b9d9954148f84c6c6f2ce23720018b0e4d5a82c5ff2"} Oct 03 08:55:07 crc kubenswrapper[4765]: E1003 08:55:07.714917 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:570e59f91d7dd66c9abcec1e54889a44c65d676d3fff6802be101fe5215bc988\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" podUID="21090664-3a4c-4c58-852d-d2779c7bf17d" Oct 03 08:55:07 crc kubenswrapper[4765]: I1003 08:55:07.762143 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" event={"ID":"2cda6800-dddc-4535-86bc-25d3f2d167b0","Type":"ContainerStarted","Data":"41cd9ca8baac559771b924cf3f74d7b28600f4d81d364a1821cd0247cf8a6ceb"} Oct 03 08:55:07 crc kubenswrapper[4765]: E1003 08:55:07.775345 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" podUID="2cda6800-dddc-4535-86bc-25d3f2d167b0" Oct 03 08:55:07 crc kubenswrapper[4765]: I1003 08:55:07.780073 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" event={"ID":"2f15ff13-62f4-43da-82f8-5d5faac4f502","Type":"ContainerStarted","Data":"07051b0e7227a009dc9d0e7d5e19f46a084b1ea9bcd330669a93f37c8c92fd4a"} Oct 03 08:55:07 crc kubenswrapper[4765]: E1003 08:55:07.782046 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.58:5001/openstack-k8s-operators/watcher-operator:45d3689e3784ce2fe5ff22a7c8f8533d389c1a20\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" Oct 03 08:55:07 crc kubenswrapper[4765]: I1003 08:55:07.790258 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" event={"ID":"b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7","Type":"ContainerStarted","Data":"2ad02675f403b39ffb3929c7798554f3204d44cd9cd7fa6a25f6a9cbf6f3d578"} Oct 03 08:55:07 crc kubenswrapper[4765]: E1003 08:55:07.792428 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" podUID="b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7" Oct 03 08:55:07 crc kubenswrapper[4765]: E1003 08:55:07.793051 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf" podUID="ac8027c9-646b-4c9f-965e-639d6ad818dd" Oct 03 08:55:07 crc kubenswrapper[4765]: I1003 08:55:07.832614 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4"] Oct 03 08:55:08 crc kubenswrapper[4765]: I1003 08:55:08.432553 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6\" (UID: \"b52d06aa-3131-4887-b541-16ada075b2dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:08 crc kubenswrapper[4765]: I1003 08:55:08.443635 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b52d06aa-3131-4887-b541-16ada075b2dd-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6\" (UID: \"b52d06aa-3131-4887-b541-16ada075b2dd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:08 crc kubenswrapper[4765]: I1003 08:55:08.547438 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:08 crc kubenswrapper[4765]: E1003 08:55:08.799838 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:110b885fe640ffdd8536e7da2a613677a6777e3d902e2ff15fa4d5968fe06c54\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" podUID="2cda6800-dddc-4535-86bc-25d3f2d167b0" Oct 03 08:55:08 crc kubenswrapper[4765]: E1003 08:55:08.800563 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:570e59f91d7dd66c9abcec1e54889a44c65d676d3fff6802be101fe5215bc988\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" podUID="21090664-3a4c-4c58-852d-d2779c7bf17d" Oct 03 08:55:08 crc kubenswrapper[4765]: E1003 08:55:08.801629 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.58:5001/openstack-k8s-operators/watcher-operator:45d3689e3784ce2fe5ff22a7c8f8533d389c1a20\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" Oct 03 08:55:08 crc kubenswrapper[4765]: E1003 08:55:08.802121 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" podUID="f3a40e37-8073-4528-9379-9cea11f883ed" Oct 03 08:55:08 crc kubenswrapper[4765]: E1003 08:55:08.806939 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" podUID="b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7" Oct 03 08:55:09 crc kubenswrapper[4765]: W1003 08:55:09.129856 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8075708f_0d72_4b98_b39e_a0ee77011c10.slice/crio-1a31d52cabad8a682908f3338a00d0c76e178c6572c5c58ce7f741366e031444 WatchSource:0}: Error finding container 1a31d52cabad8a682908f3338a00d0c76e178c6572c5c58ce7f741366e031444: Status 404 returned error can't find the container with id 1a31d52cabad8a682908f3338a00d0c76e178c6572c5c58ce7f741366e031444 Oct 03 08:55:09 crc kubenswrapper[4765]: I1003 08:55:09.806763 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" event={"ID":"8075708f-0d72-4b98-b39e-a0ee77011c10","Type":"ContainerStarted","Data":"1a31d52cabad8a682908f3338a00d0c76e178c6572c5c58ce7f741366e031444"} Oct 03 08:55:16 crc kubenswrapper[4765]: I1003 08:55:16.685257 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6"] Oct 03 08:55:16 crc kubenswrapper[4765]: I1003 08:55:16.813827 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:55:16 crc kubenswrapper[4765]: I1003 08:55:16.855431 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl" event={"ID":"4fd5d3b8-ca79-48d0-9854-5bb9bc04eae4","Type":"ContainerStarted","Data":"cf75ae5ce86456a7617c9c01c7397f05c89bd2cac3b5b4490e20f339a9f7eec4"} Oct 03 08:55:16 crc kubenswrapper[4765]: I1003 08:55:16.861547 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" event={"ID":"8075708f-0d72-4b98-b39e-a0ee77011c10","Type":"ContainerStarted","Data":"0988162ec47248c363c8e2565d1055a964361f4255f83386978d470f63ea0afc"} Oct 03 08:55:16 crc kubenswrapper[4765]: I1003 08:55:16.865838 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw" event={"ID":"4a654679-7956-4885-841e-e1718fc57bef","Type":"ContainerStarted","Data":"6c7be3c9af6c0cd649efcd2b78cfe5e62d4b29c995800ddf3b7db2bbbde38d06"} Oct 03 08:55:16 crc kubenswrapper[4765]: I1003 08:55:16.867729 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6" event={"ID":"65b40fd9-a427-4ab2-ab36-e4422081aa43","Type":"ContainerStarted","Data":"36a85b26f064ab2ac0a51a9135d32302eb6f66e14f557a67eb04cb228492e6fd"} Oct 03 08:55:16 crc kubenswrapper[4765]: I1003 08:55:16.869569 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g" event={"ID":"24929c35-6b01-4f35-9d3c-7cbd372661a7","Type":"ContainerStarted","Data":"357a97dab5a71008755fdd40a25b6634c9c3efaabace44adfd4216326cdde308"} Oct 03 08:55:16 crc kubenswrapper[4765]: I1003 08:55:16.870669 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" event={"ID":"b52d06aa-3131-4887-b541-16ada075b2dd","Type":"ContainerStarted","Data":"118bb09ca66e3f25ea5fb82de749971b97f1753f5ebda6890aa00640fa368e98"} Oct 03 08:55:16 crc kubenswrapper[4765]: I1003 08:55:16.872743 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j" event={"ID":"c6bb21ee-995b-448f-8356-bd14d768d848","Type":"ContainerStarted","Data":"ea762161e3b7cefe7a00896e0a7e594f2d1bb545a0ca78539a7fd77812e678cd"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.911102 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl" event={"ID":"4fd5d3b8-ca79-48d0-9854-5bb9bc04eae4","Type":"ContainerStarted","Data":"689b6068fd99fec1be9590d56c15d6b0144bf0b576b207805132ed23871ddc33"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.912185 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl" Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.920900 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-6t6dx" 
event={"ID":"c5aa8d54-3d82-404c-a201-d6ca76dc8a8e","Type":"ContainerStarted","Data":"47b5d5d636f042e1d0f25670cfc1a6d276d5aec8d9d4753526300f72daf835b1"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.926245 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" event={"ID":"8075708f-0d72-4b98-b39e-a0ee77011c10","Type":"ContainerStarted","Data":"f90ce86a5db33ff957e177cd9a654d1a0761026fd39be522cc0a9e20b5c6bfba"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.927731 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.936583 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4" event={"ID":"b609ed75-6fd6-4719-a9dd-eca3c0f7df03","Type":"ContainerStarted","Data":"16b71f54d699d251ae69eb6d045e56e9f403aaf7252ed6cad4d678e4df1822a2"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.945761 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs" event={"ID":"98995a33-7241-42fe-a4c8-308da084b8db","Type":"ContainerStarted","Data":"9477bf584482f5c58ae4c8671cd43ac6d421b54176b105195a2006f1531c51a8"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.958963 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl" podStartSLOduration=3.007268196 podStartE2EDuration="13.958943202s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:05.2153612 +0000 UTC m=+949.516855530" lastFinishedPulling="2025-10-03 08:55:16.167036206 +0000 UTC m=+960.468530536" observedRunningTime="2025-10-03 08:55:17.945342816 +0000 UTC m=+962.246837156" watchObservedRunningTime="2025-10-03 08:55:17.958943202 +0000 UTC m=+962.260437532" Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.959966 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f" event={"ID":"a8759262-a00d-4609-9ca0-7bb0701064f0","Type":"ContainerStarted","Data":"10865ceda9efa5cc127f7fa34895ec361a168cf2c3a6e978622dd8ba4b8631cb"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.981283 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv" event={"ID":"a6dc4446-834f-45ac-8504-1ffc7be80df3","Type":"ContainerStarted","Data":"da41615a0831d94cc554985b8864afb5a6e4bb3c865193ab1ff9e130abf33b8a"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.982423 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7" event={"ID":"11cef8fb-3e83-485c-8651-6fbf983c682a","Type":"ContainerStarted","Data":"03526bacd0fea2a050dea0350242b91de66eb8093018ed6b3af2ac882a01ef0d"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.983268 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn" event={"ID":"341b089a-f32e-4c46-a533-dbdb7f7b836f","Type":"ContainerStarted","Data":"9b5bffad4a19c8786b9cda489203e54fe82e73b29e8b2887fc80ec1f8f94cfc7"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.984130 4765 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr" event={"ID":"991c16cf-3b2a-4f6c-b792-3848bffc434d","Type":"ContainerStarted","Data":"544f9a01fb6ed31059fe15a81546904b63b950329d188eb75d309dc41c6baa72"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.985431 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j" event={"ID":"c6bb21ee-995b-448f-8356-bd14d768d848","Type":"ContainerStarted","Data":"6ec11235e5a6fb9d734c94d843a8fcd0d62d2dc085dffb662efb672386773471"} Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.985955 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j" Oct 03 08:55:17 crc kubenswrapper[4765]: I1003 08:55:17.987167 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr" event={"ID":"a9c35b93-23e9-43d5-9532-f1f0d51e8ae8","Type":"ContainerStarted","Data":"42ab08932bf012377eaf5b9d29e6f219e6ffd544a48a764c5c4f570d59dc4c71"} Oct 03 08:55:18 crc kubenswrapper[4765]: I1003 08:55:18.002605 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" podStartSLOduration=14.002584333 podStartE2EDuration="14.002584333s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:55:17.990882745 +0000 UTC m=+962.292377075" watchObservedRunningTime="2025-10-03 08:55:18.002584333 +0000 UTC m=+962.304078663" Oct 03 08:55:18 crc kubenswrapper[4765]: I1003 08:55:18.005083 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" event={"ID":"24d5db65-49a9-45d6-949a-d8310646b691","Type":"ContainerStarted","Data":"e5690eb4688c7f747a0e79184a3c47374e4d07d4a6208f10d8e79e098a643aca"} Oct 03 08:55:18 crc kubenswrapper[4765]: I1003 08:55:18.025803 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j" podStartSLOduration=4.083992891 podStartE2EDuration="14.025787484s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.2620274 +0000 UTC m=+950.563521730" lastFinishedPulling="2025-10-03 08:55:16.203821993 +0000 UTC m=+960.505316323" observedRunningTime="2025-10-03 08:55:18.023487735 +0000 UTC m=+962.324982085" watchObservedRunningTime="2025-10-03 08:55:18.025787484 +0000 UTC m=+962.327281814" Oct 03 08:55:18 crc kubenswrapper[4765]: I1003 08:55:18.040790 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw" event={"ID":"4a654679-7956-4885-841e-e1718fc57bef","Type":"ContainerStarted","Data":"ba6a3dbdcaace2529067c8e8f7b53910eef33ff80d43c08fe5a6bc959d6409ed"} Oct 03 08:55:18 crc kubenswrapper[4765]: I1003 08:55:18.040861 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw" Oct 03 08:55:18 crc kubenswrapper[4765]: I1003 08:55:18.063741 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g" 
event={"ID":"24929c35-6b01-4f35-9d3c-7cbd372661a7","Type":"ContainerStarted","Data":"beee2a43cc203d289fc9c41e74478d023459fe106a1f2aa52825bfe3f1dee089"} Oct 03 08:55:18 crc kubenswrapper[4765]: I1003 08:55:18.064340 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g" Oct 03 08:55:18 crc kubenswrapper[4765]: I1003 08:55:18.113126 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw" podStartSLOduration=4.196092586 podStartE2EDuration="14.113105117s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.297376801 +0000 UTC m=+950.598871131" lastFinishedPulling="2025-10-03 08:55:16.214389332 +0000 UTC m=+960.515883662" observedRunningTime="2025-10-03 08:55:18.085593167 +0000 UTC m=+962.387087487" watchObservedRunningTime="2025-10-03 08:55:18.113105117 +0000 UTC m=+962.414599447" Oct 03 08:55:18 crc kubenswrapper[4765]: I1003 08:55:18.116759 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g" podStartSLOduration=3.724226731 podStartE2EDuration="14.11674917s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:05.814914426 +0000 UTC m=+950.116408756" lastFinishedPulling="2025-10-03 08:55:16.207436865 +0000 UTC m=+960.508931195" observedRunningTime="2025-10-03 08:55:18.11319688 +0000 UTC m=+962.414691210" watchObservedRunningTime="2025-10-03 08:55:18.11674917 +0000 UTC m=+962.418243500" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.084913 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7" event={"ID":"11cef8fb-3e83-485c-8651-6fbf983c682a","Type":"ContainerStarted","Data":"387187c0a648ba93eda8a049d8d0033db18f3d849c838a09f56fd5512ec9889c"} Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.085723 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.088584 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6" event={"ID":"65b40fd9-a427-4ab2-ab36-e4422081aa43","Type":"ContainerStarted","Data":"b06b6560a4844de03c4ae6101839e592b424b93196644ffcc08c5cb395750174"} Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.088884 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.090420 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4" event={"ID":"b609ed75-6fd6-4719-a9dd-eca3c0f7df03","Type":"ContainerStarted","Data":"4c52bd2408b12de427ff51da6b93fd9e242a64b6af39bd4dc60ecf6ade973872"} Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.090554 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.093835 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr" 
event={"ID":"a9c35b93-23e9-43d5-9532-f1f0d51e8ae8","Type":"ContainerStarted","Data":"b85bac734d4363bc7d84898aa8aed8228d6688bf0fa4902b40d412a823405a96"} Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.094239 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.099162 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-6t6dx" event={"ID":"c5aa8d54-3d82-404c-a201-d6ca76dc8a8e","Type":"ContainerStarted","Data":"c135d064b39ed03f23382c441f8666ea24591f7f3ad808e83d859431df26d9c9"} Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.099750 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-599898f689-6t6dx" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.119502 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv" event={"ID":"a6dc4446-834f-45ac-8504-1ffc7be80df3","Type":"ContainerStarted","Data":"04ebc38549ef9501a4cba20f6512797904badfa5087cb16b5bee294471e71d2e"} Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.120553 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.124468 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7" podStartSLOduration=5.096460082 podStartE2EDuration="15.124449769s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.188915929 +0000 UTC m=+950.490410259" lastFinishedPulling="2025-10-03 08:55:16.216905616 +0000 UTC m=+960.518399946" observedRunningTime="2025-10-03 08:55:19.107367274 +0000 UTC m=+963.408861604" watchObservedRunningTime="2025-10-03 08:55:19.124449769 +0000 UTC m=+963.425944099" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.129637 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-599898f689-6t6dx" podStartSLOduration=4.740629612 podStartE2EDuration="15.129618751s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:05.814961167 +0000 UTC m=+950.116455497" lastFinishedPulling="2025-10-03 08:55:16.203950306 +0000 UTC m=+960.505444636" observedRunningTime="2025-10-03 08:55:19.125779913 +0000 UTC m=+963.427274233" watchObservedRunningTime="2025-10-03 08:55:19.129618751 +0000 UTC m=+963.431113081" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.134334 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" event={"ID":"24d5db65-49a9-45d6-949a-d8310646b691","Type":"ContainerStarted","Data":"4830298cb25b66b7aad1d99dda318186e152cd8f8df226e4124d748723bf277d"} Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.134499 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.137965 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr" 
event={"ID":"991c16cf-3b2a-4f6c-b792-3848bffc434d","Type":"ContainerStarted","Data":"36880157d9729758f72c72b75f25f4d06479a094f78d8558a547b4eb09cb75b7"} Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.138108 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.139756 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn" event={"ID":"341b089a-f32e-4c46-a533-dbdb7f7b836f","Type":"ContainerStarted","Data":"cad40486b27b2f5839ddafcf53198b5b93812bb0394a22f134f5d7e9dbf18e25"} Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.140418 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.142518 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr" podStartSLOduration=4.13181574 podStartE2EDuration="15.142507269s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:05.206066943 +0000 UTC m=+949.507561273" lastFinishedPulling="2025-10-03 08:55:16.216758472 +0000 UTC m=+960.518252802" observedRunningTime="2025-10-03 08:55:19.140838566 +0000 UTC m=+963.442332896" watchObservedRunningTime="2025-10-03 08:55:19.142507269 +0000 UTC m=+963.444001599" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.146789 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs" event={"ID":"98995a33-7241-42fe-a4c8-308da084b8db","Type":"ContainerStarted","Data":"258161e32e79b805c78111efc46e3645055ba6efc57c3f8222474e427fa34663"} Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.147050 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.151946 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f" event={"ID":"a8759262-a00d-4609-9ca0-7bb0701064f0","Type":"ContainerStarted","Data":"c16a62cf5e0ef6ba41043d07503367500a439f6a1357894f755ac13eca5b81a4"} Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.160693 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6" podStartSLOduration=5.207264023 podStartE2EDuration="15.160668201s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.261702982 +0000 UTC m=+950.563197312" lastFinishedPulling="2025-10-03 08:55:16.21510716 +0000 UTC m=+960.516601490" observedRunningTime="2025-10-03 08:55:19.160141518 +0000 UTC m=+963.461635848" watchObservedRunningTime="2025-10-03 08:55:19.160668201 +0000 UTC m=+963.462162541" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.199006 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4" podStartSLOduration=5.285815423 podStartE2EDuration="15.198987707s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.296655762 +0000 UTC 
m=+950.598150092" lastFinishedPulling="2025-10-03 08:55:16.209828056 +0000 UTC m=+960.511322376" observedRunningTime="2025-10-03 08:55:19.180965728 +0000 UTC m=+963.482460068" watchObservedRunningTime="2025-10-03 08:55:19.198987707 +0000 UTC m=+963.500482047" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.199376 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f" podStartSLOduration=5.183426407 podStartE2EDuration="15.199372597s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.187675237 +0000 UTC m=+950.489169567" lastFinishedPulling="2025-10-03 08:55:16.203621427 +0000 UTC m=+960.505115757" observedRunningTime="2025-10-03 08:55:19.196419962 +0000 UTC m=+963.497914302" watchObservedRunningTime="2025-10-03 08:55:19.199372597 +0000 UTC m=+963.500866927" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.225273 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr" podStartSLOduration=4.848115119 podStartE2EDuration="15.225255346s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:05.839584704 +0000 UTC m=+950.141079034" lastFinishedPulling="2025-10-03 08:55:16.216724931 +0000 UTC m=+960.518219261" observedRunningTime="2025-10-03 08:55:19.221849409 +0000 UTC m=+963.523343739" watchObservedRunningTime="2025-10-03 08:55:19.225255346 +0000 UTC m=+963.526749676" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.238356 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn" podStartSLOduration=4.836817541 podStartE2EDuration="15.238337109s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:05.814561067 +0000 UTC m=+950.116055397" lastFinishedPulling="2025-10-03 08:55:16.216080635 +0000 UTC m=+960.517574965" observedRunningTime="2025-10-03 08:55:19.23720394 +0000 UTC m=+963.538698280" watchObservedRunningTime="2025-10-03 08:55:19.238337109 +0000 UTC m=+963.539831439" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.257074 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs" podStartSLOduration=5.530829032 podStartE2EDuration="15.257050035s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.479929149 +0000 UTC m=+950.781423479" lastFinishedPulling="2025-10-03 08:55:16.206150152 +0000 UTC m=+960.507644482" observedRunningTime="2025-10-03 08:55:19.256524362 +0000 UTC m=+963.558018692" watchObservedRunningTime="2025-10-03 08:55:19.257050035 +0000 UTC m=+963.558544365" Oct 03 08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.286424 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" podStartSLOduration=4.91332495 podStartE2EDuration="15.286398263s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:05.843781311 +0000 UTC m=+950.145275651" lastFinishedPulling="2025-10-03 08:55:16.216854634 +0000 UTC m=+960.518348964" observedRunningTime="2025-10-03 08:55:19.277376083 +0000 UTC m=+963.578870423" watchObservedRunningTime="2025-10-03 08:55:19.286398263 +0000 UTC m=+963.587892603" Oct 03 
08:55:19 crc kubenswrapper[4765]: I1003 08:55:19.300361 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv" podStartSLOduration=5.350556093 podStartE2EDuration="15.300337618s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.260148413 +0000 UTC m=+950.561642733" lastFinishedPulling="2025-10-03 08:55:16.209929928 +0000 UTC m=+960.511424258" observedRunningTime="2025-10-03 08:55:19.296275914 +0000 UTC m=+963.597770244" watchObservedRunningTime="2025-10-03 08:55:19.300337618 +0000 UTC m=+963.601831948" Oct 03 08:55:20 crc kubenswrapper[4765]: I1003 08:55:20.158623 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" event={"ID":"b52d06aa-3131-4887-b541-16ada075b2dd","Type":"ContainerStarted","Data":"2f47ed3442705d3f221633a27302cbe2d0b166c9ce1e988739e7dc7702c30957"} Oct 03 08:55:20 crc kubenswrapper[4765]: I1003 08:55:20.159409 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f" Oct 03 08:55:20 crc kubenswrapper[4765]: I1003 08:55:20.159429 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" event={"ID":"b52d06aa-3131-4887-b541-16ada075b2dd","Type":"ContainerStarted","Data":"7275f24992dbd066a3d19e9545db696b40d6519a04bc5ebd997f6c9f0358ff25"} Oct 03 08:55:20 crc kubenswrapper[4765]: I1003 08:55:20.188717 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" podStartSLOduration=13.482707537 podStartE2EDuration="16.188697418s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:16.813563888 +0000 UTC m=+961.115058218" lastFinishedPulling="2025-10-03 08:55:19.519553779 +0000 UTC m=+963.821048099" observedRunningTime="2025-10-03 08:55:20.185549618 +0000 UTC m=+964.487043968" watchObservedRunningTime="2025-10-03 08:55:20.188697418 +0000 UTC m=+964.490191748" Oct 03 08:55:21 crc kubenswrapper[4765]: I1003 08:55:21.164935 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:21 crc kubenswrapper[4765]: I1003 08:55:21.167257 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-599898f689-6t6dx" Oct 03 08:55:22 crc kubenswrapper[4765]: I1003 08:55:22.175464 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" event={"ID":"2f15ff13-62f4-43da-82f8-5d5faac4f502","Type":"ContainerStarted","Data":"41985ff18b8485559f3bcbb5992777d8e71fb3d367ec41f5807bbba76ecd942a"} Oct 03 08:55:22 crc kubenswrapper[4765]: I1003 08:55:22.176090 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" Oct 03 08:55:22 crc kubenswrapper[4765]: I1003 08:55:22.193526 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" podStartSLOduration=3.452871343 podStartE2EDuration="18.193507915s" 
podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.493494754 +0000 UTC m=+950.794989084" lastFinishedPulling="2025-10-03 08:55:21.234131326 +0000 UTC m=+965.535625656" observedRunningTime="2025-10-03 08:55:22.192604942 +0000 UTC m=+966.494099292" watchObservedRunningTime="2025-10-03 08:55:22.193507915 +0000 UTC m=+966.495002245" Oct 03 08:55:23 crc kubenswrapper[4765]: I1003 08:55:23.184547 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" event={"ID":"f3a40e37-8073-4528-9379-9cea11f883ed","Type":"ContainerStarted","Data":"2bae228f5421f927a4e7725396409c0e57c188eb050d891ea7d345b343606063"} Oct 03 08:55:23 crc kubenswrapper[4765]: I1003 08:55:23.185137 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" Oct 03 08:55:23 crc kubenswrapper[4765]: I1003 08:55:23.201401 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" podStartSLOduration=3.489430143 podStartE2EDuration="19.201386038s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.497623629 +0000 UTC m=+950.799117959" lastFinishedPulling="2025-10-03 08:55:22.209579524 +0000 UTC m=+966.511073854" observedRunningTime="2025-10-03 08:55:23.200140056 +0000 UTC m=+967.501634396" watchObservedRunningTime="2025-10-03 08:55:23.201386038 +0000 UTC m=+967.502880368" Oct 03 08:55:24 crc kubenswrapper[4765]: I1003 08:55:24.360481 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-z24cr" Oct 03 08:55:24 crc kubenswrapper[4765]: I1003 08:55:24.388500 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-kxsxl" Oct 03 08:55:24 crc kubenswrapper[4765]: I1003 08:55:24.451151 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-z5mzn" Oct 03 08:55:24 crc kubenswrapper[4765]: I1003 08:55:24.504977 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-tjk7g" Oct 03 08:55:24 crc kubenswrapper[4765]: I1003 08:55:24.572439 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-k5nkl" Oct 03 08:55:24 crc kubenswrapper[4765]: I1003 08:55:24.630255 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-kc4kr" Oct 03 08:55:24 crc kubenswrapper[4765]: I1003 08:55:24.713355 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-c9sw7" Oct 03 08:55:24 crc kubenswrapper[4765]: I1003 08:55:24.783498 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-88qh4" Oct 03 08:55:24 crc kubenswrapper[4765]: I1003 08:55:24.806912 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-rg2n6" Oct 03 08:55:24 crc 
kubenswrapper[4765]: I1003 08:55:24.856338 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-7zq5j" Oct 03 08:55:24 crc kubenswrapper[4765]: I1003 08:55:24.929380 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-zmn6f" Oct 03 08:55:25 crc kubenswrapper[4765]: I1003 08:55:25.000705 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-qmgwv" Oct 03 08:55:25 crc kubenswrapper[4765]: I1003 08:55:25.035701 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-gsgcs" Oct 03 08:55:25 crc kubenswrapper[4765]: I1003 08:55:25.105340 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5rxfw" Oct 03 08:55:25 crc kubenswrapper[4765]: I1003 08:55:25.198263 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf" event={"ID":"ac8027c9-646b-4c9f-965e-639d6ad818dd","Type":"ContainerStarted","Data":"f56ca5a3f9c83f3d959f9a87612de150eb7e681c0cf79137f27883bc2767f17d"} Oct 03 08:55:25 crc kubenswrapper[4765]: I1003 08:55:25.199996 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" event={"ID":"2cda6800-dddc-4535-86bc-25d3f2d167b0","Type":"ContainerStarted","Data":"3188a562c2464566259bf8f1f420c3dae8615a7ef4c2a03c8c3695d8638781dd"} Oct 03 08:55:25 crc kubenswrapper[4765]: I1003 08:55:25.200242 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" Oct 03 08:55:25 crc kubenswrapper[4765]: I1003 08:55:25.202616 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" event={"ID":"b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7","Type":"ContainerStarted","Data":"a033c2d97402d86c7f0506cf36f9e76ca8ff94764f2db08912f910ac3aaeef95"} Oct 03 08:55:25 crc kubenswrapper[4765]: I1003 08:55:25.202817 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" Oct 03 08:55:25 crc kubenswrapper[4765]: I1003 08:55:25.221202 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf" podStartSLOduration=3.473654031 podStartE2EDuration="21.221180507s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.341305169 +0000 UTC m=+950.642799499" lastFinishedPulling="2025-10-03 08:55:24.088831645 +0000 UTC m=+968.390325975" observedRunningTime="2025-10-03 08:55:25.213623564 +0000 UTC m=+969.515117894" watchObservedRunningTime="2025-10-03 08:55:25.221180507 +0000 UTC m=+969.522674847" Oct 03 08:55:25 crc kubenswrapper[4765]: I1003 08:55:25.235448 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" podStartSLOduration=3.632126945 podStartE2EDuration="21.235415139s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 
08:55:06.489378389 +0000 UTC m=+950.790872719" lastFinishedPulling="2025-10-03 08:55:24.092666583 +0000 UTC m=+968.394160913" observedRunningTime="2025-10-03 08:55:25.228093613 +0000 UTC m=+969.529587943" watchObservedRunningTime="2025-10-03 08:55:25.235415139 +0000 UTC m=+969.536909469" Oct 03 08:55:25 crc kubenswrapper[4765]: I1003 08:55:25.244476 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" podStartSLOduration=3.459361858 podStartE2EDuration="21.24446344s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.320086449 +0000 UTC m=+950.621580779" lastFinishedPulling="2025-10-03 08:55:24.105188031 +0000 UTC m=+968.406682361" observedRunningTime="2025-10-03 08:55:25.244165612 +0000 UTC m=+969.545659942" watchObservedRunningTime="2025-10-03 08:55:25.24446344 +0000 UTC m=+969.545957770" Oct 03 08:55:26 crc kubenswrapper[4765]: I1003 08:55:26.210745 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" event={"ID":"21090664-3a4c-4c58-852d-d2779c7bf17d","Type":"ContainerStarted","Data":"bf427673a2bcd5e0852fde29bad24555611e2ba3bd09dee61c7e558cf28b66e9"} Oct 03 08:55:26 crc kubenswrapper[4765]: I1003 08:55:26.211485 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" Oct 03 08:55:26 crc kubenswrapper[4765]: I1003 08:55:26.228355 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" podStartSLOduration=2.937132331 podStartE2EDuration="22.228337272s" podCreationTimestamp="2025-10-03 08:55:04 +0000 UTC" firstStartedPulling="2025-10-03 08:55:06.299367111 +0000 UTC m=+950.600861441" lastFinishedPulling="2025-10-03 08:55:25.590572042 +0000 UTC m=+969.892066382" observedRunningTime="2025-10-03 08:55:26.224092644 +0000 UTC m=+970.525586994" watchObservedRunningTime="2025-10-03 08:55:26.228337272 +0000 UTC m=+970.529831602" Oct 03 08:55:26 crc kubenswrapper[4765]: I1003 08:55:26.937695 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-jsrj4" Oct 03 08:55:28 crc kubenswrapper[4765]: I1003 08:55:28.553826 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6" Oct 03 08:55:30 crc kubenswrapper[4765]: I1003 08:55:30.680426 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:55:30 crc kubenswrapper[4765]: I1003 08:55:30.680490 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:55:30 crc kubenswrapper[4765]: I1003 08:55:30.680533 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:55:30 crc kubenswrapper[4765]: I1003 08:55:30.681112 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bfe51b01984985879807e07e7a2482b4ea6735b787f2d94829df4202c6f13dc1"} pod="openshift-machine-config-operator/machine-config-daemon-j8mss" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:55:30 crc kubenswrapper[4765]: I1003 08:55:30.681167 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" containerID="cri-o://bfe51b01984985879807e07e7a2482b4ea6735b787f2d94829df4202c6f13dc1" gracePeriod=600 Oct 03 08:55:31 crc kubenswrapper[4765]: I1003 08:55:31.244596 4765 generic.go:334] "Generic (PLEG): container finished" podID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerID="bfe51b01984985879807e07e7a2482b4ea6735b787f2d94829df4202c6f13dc1" exitCode=0 Oct 03 08:55:31 crc kubenswrapper[4765]: I1003 08:55:31.244678 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerDied","Data":"bfe51b01984985879807e07e7a2482b4ea6735b787f2d94829df4202c6f13dc1"} Oct 03 08:55:31 crc kubenswrapper[4765]: I1003 08:55:31.245231 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"3ea26be161c74098dfac030bd0f30c6280c924f6e3ecfdb459bea4fd5d08ace1"} Oct 03 08:55:31 crc kubenswrapper[4765]: I1003 08:55:31.245257 4765 scope.go:117] "RemoveContainer" containerID="5477c0c212e204859773f5f32ccf8d9a259a11b347c7a6534e70a233d47641c8" Oct 03 08:55:35 crc kubenswrapper[4765]: I1003 08:55:35.099892 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-8tlhl" Oct 03 08:55:35 crc kubenswrapper[4765]: I1003 08:55:35.116233 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-867c4" Oct 03 08:55:35 crc kubenswrapper[4765]: I1003 08:55:35.133081 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-qjp99" Oct 03 08:55:35 crc kubenswrapper[4765]: I1003 08:55:35.315799 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-vvxdl" Oct 03 08:55:35 crc kubenswrapper[4765]: I1003 08:55:35.401847 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" Oct 03 08:55:39 crc kubenswrapper[4765]: I1003 08:55:39.879516 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28"] Oct 03 08:55:39 crc kubenswrapper[4765]: I1003 08:55:39.880372 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" 
containerName="kube-rbac-proxy" containerID="cri-o://07051b0e7227a009dc9d0e7d5e19f46a084b1ea9bcd330669a93f37c8c92fd4a" gracePeriod=10 Oct 03 08:55:39 crc kubenswrapper[4765]: I1003 08:55:39.880432 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" containerName="manager" containerID="cri-o://41985ff18b8485559f3bcbb5992777d8e71fb3d367ec41f5807bbba76ecd942a" gracePeriod=10 Oct 03 08:55:39 crc kubenswrapper[4765]: I1003 08:55:39.906088 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc"] Oct 03 08:55:39 crc kubenswrapper[4765]: I1003 08:55:39.906825 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" podUID="1aa867b6-fd87-48cf-9461-338e10d56738" containerName="operator" containerID="cri-o://7307e245178cc09e6b2a102b3c255b2c4113b1bd03d917cb0dcb700e13a7f72e" gracePeriod=10 Oct 03 08:55:39 crc kubenswrapper[4765]: I1003 08:55:39.906919 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" podUID="1aa867b6-fd87-48cf-9461-338e10d56738" containerName="kube-rbac-proxy" containerID="cri-o://fb0b3b8b2c36d2ced99c525dfb9626f57a48ff42d673427d6429fb33f9b1f215" gracePeriod=10 Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.367097 4765 generic.go:334] "Generic (PLEG): container finished" podID="2f15ff13-62f4-43da-82f8-5d5faac4f502" containerID="41985ff18b8485559f3bcbb5992777d8e71fb3d367ec41f5807bbba76ecd942a" exitCode=0 Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.367386 4765 generic.go:334] "Generic (PLEG): container finished" podID="2f15ff13-62f4-43da-82f8-5d5faac4f502" containerID="07051b0e7227a009dc9d0e7d5e19f46a084b1ea9bcd330669a93f37c8c92fd4a" exitCode=0 Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.367732 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" event={"ID":"2f15ff13-62f4-43da-82f8-5d5faac4f502","Type":"ContainerDied","Data":"41985ff18b8485559f3bcbb5992777d8e71fb3d367ec41f5807bbba76ecd942a"} Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.367821 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" event={"ID":"2f15ff13-62f4-43da-82f8-5d5faac4f502","Type":"ContainerDied","Data":"07051b0e7227a009dc9d0e7d5e19f46a084b1ea9bcd330669a93f37c8c92fd4a"} Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.382564 4765 generic.go:334] "Generic (PLEG): container finished" podID="1aa867b6-fd87-48cf-9461-338e10d56738" containerID="fb0b3b8b2c36d2ced99c525dfb9626f57a48ff42d673427d6429fb33f9b1f215" exitCode=0 Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.382589 4765 generic.go:334] "Generic (PLEG): container finished" podID="1aa867b6-fd87-48cf-9461-338e10d56738" containerID="7307e245178cc09e6b2a102b3c255b2c4113b1bd03d917cb0dcb700e13a7f72e" exitCode=0 Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.382624 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" 
event={"ID":"1aa867b6-fd87-48cf-9461-338e10d56738","Type":"ContainerDied","Data":"fb0b3b8b2c36d2ced99c525dfb9626f57a48ff42d673427d6429fb33f9b1f215"} Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.382658 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" event={"ID":"1aa867b6-fd87-48cf-9461-338e10d56738","Type":"ContainerDied","Data":"7307e245178cc09e6b2a102b3c255b2c4113b1bd03d917cb0dcb700e13a7f72e"} Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.542603 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.551064 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.632803 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxrzw\" (UniqueName: \"kubernetes.io/projected/1aa867b6-fd87-48cf-9461-338e10d56738-kube-api-access-hxrzw\") pod \"1aa867b6-fd87-48cf-9461-338e10d56738\" (UID: \"1aa867b6-fd87-48cf-9461-338e10d56738\") " Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.632943 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcthx\" (UniqueName: \"kubernetes.io/projected/2f15ff13-62f4-43da-82f8-5d5faac4f502-kube-api-access-kcthx\") pod \"2f15ff13-62f4-43da-82f8-5d5faac4f502\" (UID: \"2f15ff13-62f4-43da-82f8-5d5faac4f502\") " Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.638500 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa867b6-fd87-48cf-9461-338e10d56738-kube-api-access-hxrzw" (OuterVolumeSpecName: "kube-api-access-hxrzw") pod "1aa867b6-fd87-48cf-9461-338e10d56738" (UID: "1aa867b6-fd87-48cf-9461-338e10d56738"). InnerVolumeSpecName "kube-api-access-hxrzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.642070 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f15ff13-62f4-43da-82f8-5d5faac4f502-kube-api-access-kcthx" (OuterVolumeSpecName: "kube-api-access-kcthx") pod "2f15ff13-62f4-43da-82f8-5d5faac4f502" (UID: "2f15ff13-62f4-43da-82f8-5d5faac4f502"). InnerVolumeSpecName "kube-api-access-kcthx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.734396 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcthx\" (UniqueName: \"kubernetes.io/projected/2f15ff13-62f4-43da-82f8-5d5faac4f502-kube-api-access-kcthx\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:40 crc kubenswrapper[4765]: I1003 08:55:40.734427 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxrzw\" (UniqueName: \"kubernetes.io/projected/1aa867b6-fd87-48cf-9461-338e10d56738-kube-api-access-hxrzw\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.392161 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" event={"ID":"2f15ff13-62f4-43da-82f8-5d5faac4f502","Type":"ContainerDied","Data":"b93e242d2d9c5790d8c5088e298958ba63ef77082fe06dd007f9782eaf76b573"} Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.392221 4765 scope.go:117] "RemoveContainer" containerID="41985ff18b8485559f3bcbb5992777d8e71fb3d367ec41f5807bbba76ecd942a" Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.392183 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28" Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.394446 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" event={"ID":"1aa867b6-fd87-48cf-9461-338e10d56738","Type":"ContainerDied","Data":"877cb5145318926b815933b0b847d79fec5beb30962ef2ee89701212f6ff5d4d"} Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.394528 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc" Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.413062 4765 scope.go:117] "RemoveContainer" containerID="07051b0e7227a009dc9d0e7d5e19f46a084b1ea9bcd330669a93f37c8c92fd4a" Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.427950 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28"] Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.431704 4765 scope.go:117] "RemoveContainer" containerID="fb0b3b8b2c36d2ced99c525dfb9626f57a48ff42d673427d6429fb33f9b1f215" Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.434193 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5c899c45b6-mkt28"] Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.449791 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc"] Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.461070 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f8d586cd6-l6vxc"] Oct 03 08:55:41 crc kubenswrapper[4765]: I1003 08:55:41.462460 4765 scope.go:117] "RemoveContainer" containerID="7307e245178cc09e6b2a102b3c255b2c4113b1bd03d917cb0dcb700e13a7f72e" Oct 03 08:55:42 crc kubenswrapper[4765]: I1003 08:55:42.318306 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa867b6-fd87-48cf-9461-338e10d56738" path="/var/lib/kubelet/pods/1aa867b6-fd87-48cf-9461-338e10d56738/volumes" Oct 03 08:55:42 crc kubenswrapper[4765]: I1003 08:55:42.319283 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" path="/var/lib/kubelet/pods/2f15ff13-62f4-43da-82f8-5d5faac4f502/volumes" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.024782 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-index-g27pq"] Oct 03 08:55:43 crc kubenswrapper[4765]: E1003 08:55:43.025086 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" containerName="manager" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.025100 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" containerName="manager" Oct 03 08:55:43 crc kubenswrapper[4765]: E1003 08:55:43.025131 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" containerName="kube-rbac-proxy" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.025138 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" containerName="kube-rbac-proxy" Oct 03 08:55:43 crc kubenswrapper[4765]: E1003 08:55:43.025155 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa867b6-fd87-48cf-9461-338e10d56738" containerName="kube-rbac-proxy" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.025161 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa867b6-fd87-48cf-9461-338e10d56738" containerName="kube-rbac-proxy" Oct 03 08:55:43 crc kubenswrapper[4765]: E1003 08:55:43.025172 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa867b6-fd87-48cf-9461-338e10d56738" containerName="operator" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.025178 4765 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa867b6-fd87-48cf-9461-338e10d56738" containerName="operator" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.025390 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" containerName="kube-rbac-proxy" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.025403 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa867b6-fd87-48cf-9461-338e10d56738" containerName="kube-rbac-proxy" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.025412 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f15ff13-62f4-43da-82f8-5d5faac4f502" containerName="manager" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.025421 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa867b6-fd87-48cf-9461-338e10d56738" containerName="operator" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.025880 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-index-g27pq" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.032442 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-index-dockercfg-xzbxx" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.060835 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-g27pq"] Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.165700 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lxjd\" (UniqueName: \"kubernetes.io/projected/33ebdfff-a8cc-4e8f-99c3-a670a9f769cc-kube-api-access-2lxjd\") pod \"watcher-operator-index-g27pq\" (UID: \"33ebdfff-a8cc-4e8f-99c3-a670a9f769cc\") " pod="openstack-operators/watcher-operator-index-g27pq" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.267531 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lxjd\" (UniqueName: \"kubernetes.io/projected/33ebdfff-a8cc-4e8f-99c3-a670a9f769cc-kube-api-access-2lxjd\") pod \"watcher-operator-index-g27pq\" (UID: \"33ebdfff-a8cc-4e8f-99c3-a670a9f769cc\") " pod="openstack-operators/watcher-operator-index-g27pq" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.286862 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lxjd\" (UniqueName: \"kubernetes.io/projected/33ebdfff-a8cc-4e8f-99c3-a670a9f769cc-kube-api-access-2lxjd\") pod \"watcher-operator-index-g27pq\" (UID: \"33ebdfff-a8cc-4e8f-99c3-a670a9f769cc\") " pod="openstack-operators/watcher-operator-index-g27pq" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.341959 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-g27pq" Oct 03 08:55:43 crc kubenswrapper[4765]: I1003 08:55:43.915052 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-g27pq"] Oct 03 08:55:44 crc kubenswrapper[4765]: I1003 08:55:44.423531 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-g27pq" event={"ID":"33ebdfff-a8cc-4e8f-99c3-a670a9f769cc","Type":"ContainerStarted","Data":"c9d1ad654e54b0dcecd1e2dfd0d1ba1b73a6eea2b64948f17fea87ada3f1ce0b"} Oct 03 08:55:46 crc kubenswrapper[4765]: I1003 08:55:46.811786 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-index-g27pq"] Oct 03 08:55:47 crc kubenswrapper[4765]: I1003 08:55:47.421118 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-index-v95cb"] Oct 03 08:55:47 crc kubenswrapper[4765]: I1003 08:55:47.422497 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-index-v95cb" Oct 03 08:55:47 crc kubenswrapper[4765]: I1003 08:55:47.469160 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-v95cb"] Oct 03 08:55:47 crc kubenswrapper[4765]: I1003 08:55:47.524746 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7kwz\" (UniqueName: \"kubernetes.io/projected/5a3da866-1f20-45f7-b2da-6d7beac4f7f4-kube-api-access-z7kwz\") pod \"watcher-operator-index-v95cb\" (UID: \"5a3da866-1f20-45f7-b2da-6d7beac4f7f4\") " pod="openstack-operators/watcher-operator-index-v95cb" Oct 03 08:55:47 crc kubenswrapper[4765]: I1003 08:55:47.626916 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7kwz\" (UniqueName: \"kubernetes.io/projected/5a3da866-1f20-45f7-b2da-6d7beac4f7f4-kube-api-access-z7kwz\") pod \"watcher-operator-index-v95cb\" (UID: \"5a3da866-1f20-45f7-b2da-6d7beac4f7f4\") " pod="openstack-operators/watcher-operator-index-v95cb" Oct 03 08:55:47 crc kubenswrapper[4765]: I1003 08:55:47.647761 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7kwz\" (UniqueName: \"kubernetes.io/projected/5a3da866-1f20-45f7-b2da-6d7beac4f7f4-kube-api-access-z7kwz\") pod \"watcher-operator-index-v95cb\" (UID: \"5a3da866-1f20-45f7-b2da-6d7beac4f7f4\") " pod="openstack-operators/watcher-operator-index-v95cb" Oct 03 08:55:47 crc kubenswrapper[4765]: I1003 08:55:47.755070 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-v95cb" Oct 03 08:55:48 crc kubenswrapper[4765]: I1003 08:55:48.197734 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-index-v95cb"] Oct 03 08:55:48 crc kubenswrapper[4765]: I1003 08:55:48.458709 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-v95cb" event={"ID":"5a3da866-1f20-45f7-b2da-6d7beac4f7f4","Type":"ContainerStarted","Data":"8c32d2d70e34ca9c1e8446cf6b3d5d640b44d1472c5aae9aa520d96e98c1bd18"} Oct 03 08:55:51 crc kubenswrapper[4765]: I1003 08:55:51.480268 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-g27pq" event={"ID":"33ebdfff-a8cc-4e8f-99c3-a670a9f769cc","Type":"ContainerStarted","Data":"4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30"} Oct 03 08:55:51 crc kubenswrapper[4765]: I1003 08:55:51.480778 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-index-g27pq" podUID="33ebdfff-a8cc-4e8f-99c3-a670a9f769cc" containerName="registry-server" containerID="cri-o://4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30" gracePeriod=2 Oct 03 08:55:51 crc kubenswrapper[4765]: I1003 08:55:51.484891 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-v95cb" event={"ID":"5a3da866-1f20-45f7-b2da-6d7beac4f7f4","Type":"ContainerStarted","Data":"6c5f0b333c1ac44b6cee8ae74a58cd00b76ae960ef4187a90301b2cd23dd8bcc"} Oct 03 08:55:51 crc kubenswrapper[4765]: I1003 08:55:51.505575 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-index-g27pq" podStartSLOduration=1.9578950640000001 podStartE2EDuration="8.505548133s" podCreationTimestamp="2025-10-03 08:55:43 +0000 UTC" firstStartedPulling="2025-10-03 08:55:43.931613114 +0000 UTC m=+988.233107444" lastFinishedPulling="2025-10-03 08:55:50.479266183 +0000 UTC m=+994.780760513" observedRunningTime="2025-10-03 08:55:51.500815903 +0000 UTC m=+995.802310253" watchObservedRunningTime="2025-10-03 08:55:51.505548133 +0000 UTC m=+995.807042463" Oct 03 08:55:51 crc kubenswrapper[4765]: I1003 08:55:51.872903 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-g27pq" Oct 03 08:55:51 crc kubenswrapper[4765]: I1003 08:55:51.889271 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-index-v95cb" podStartSLOduration=2.613184429 podStartE2EDuration="4.889249037s" podCreationTimestamp="2025-10-03 08:55:47 +0000 UTC" firstStartedPulling="2025-10-03 08:55:48.204549959 +0000 UTC m=+992.506044299" lastFinishedPulling="2025-10-03 08:55:50.480614577 +0000 UTC m=+994.782108907" observedRunningTime="2025-10-03 08:55:51.51655202 +0000 UTC m=+995.818046350" watchObservedRunningTime="2025-10-03 08:55:51.889249037 +0000 UTC m=+996.190743367" Oct 03 08:55:51 crc kubenswrapper[4765]: I1003 08:55:51.991014 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lxjd\" (UniqueName: \"kubernetes.io/projected/33ebdfff-a8cc-4e8f-99c3-a670a9f769cc-kube-api-access-2lxjd\") pod \"33ebdfff-a8cc-4e8f-99c3-a670a9f769cc\" (UID: \"33ebdfff-a8cc-4e8f-99c3-a670a9f769cc\") " Oct 03 08:55:51 crc kubenswrapper[4765]: I1003 08:55:51.997002 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ebdfff-a8cc-4e8f-99c3-a670a9f769cc-kube-api-access-2lxjd" (OuterVolumeSpecName: "kube-api-access-2lxjd") pod "33ebdfff-a8cc-4e8f-99c3-a670a9f769cc" (UID: "33ebdfff-a8cc-4e8f-99c3-a670a9f769cc"). InnerVolumeSpecName "kube-api-access-2lxjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:55:52 crc kubenswrapper[4765]: I1003 08:55:52.092421 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lxjd\" (UniqueName: \"kubernetes.io/projected/33ebdfff-a8cc-4e8f-99c3-a670a9f769cc-kube-api-access-2lxjd\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:52 crc kubenswrapper[4765]: I1003 08:55:52.497041 4765 generic.go:334] "Generic (PLEG): container finished" podID="33ebdfff-a8cc-4e8f-99c3-a670a9f769cc" containerID="4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30" exitCode=0 Oct 03 08:55:52 crc kubenswrapper[4765]: I1003 08:55:52.497112 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-g27pq" event={"ID":"33ebdfff-a8cc-4e8f-99c3-a670a9f769cc","Type":"ContainerDied","Data":"4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30"} Oct 03 08:55:52 crc kubenswrapper[4765]: I1003 08:55:52.497133 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-index-g27pq" Oct 03 08:55:52 crc kubenswrapper[4765]: I1003 08:55:52.497173 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-index-g27pq" event={"ID":"33ebdfff-a8cc-4e8f-99c3-a670a9f769cc","Type":"ContainerDied","Data":"c9d1ad654e54b0dcecd1e2dfd0d1ba1b73a6eea2b64948f17fea87ada3f1ce0b"} Oct 03 08:55:52 crc kubenswrapper[4765]: I1003 08:55:52.497194 4765 scope.go:117] "RemoveContainer" containerID="4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30" Oct 03 08:55:52 crc kubenswrapper[4765]: I1003 08:55:52.517420 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-index-g27pq"] Oct 03 08:55:52 crc kubenswrapper[4765]: I1003 08:55:52.521609 4765 scope.go:117] "RemoveContainer" containerID="4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30" Oct 03 08:55:52 crc kubenswrapper[4765]: E1003 08:55:52.522044 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30\": container with ID starting with 4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30 not found: ID does not exist" containerID="4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30" Oct 03 08:55:52 crc kubenswrapper[4765]: I1003 08:55:52.522087 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30"} err="failed to get container status \"4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30\": rpc error: code = NotFound desc = could not find container \"4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30\": container with ID starting with 4d8d3e4684b97a73ac89928a4d44687940ca435c1242c7f81e820b8aedf95e30 not found: ID does not exist" Oct 03 08:55:52 crc kubenswrapper[4765]: I1003 08:55:52.522390 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-index-g27pq"] Oct 03 08:55:54 crc kubenswrapper[4765]: I1003 08:55:54.314891 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ebdfff-a8cc-4e8f-99c3-a670a9f769cc" path="/var/lib/kubelet/pods/33ebdfff-a8cc-4e8f-99c3-a670a9f769cc/volumes" Oct 03 08:55:57 crc kubenswrapper[4765]: I1003 08:55:57.755563 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/watcher-operator-index-v95cb" Oct 03 08:55:57 crc kubenswrapper[4765]: I1003 08:55:57.755924 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-index-v95cb" Oct 03 08:55:57 crc kubenswrapper[4765]: I1003 08:55:57.783262 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/watcher-operator-index-v95cb" Oct 03 08:55:58 crc kubenswrapper[4765]: I1003 08:55:58.566916 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-index-v95cb" Oct 03 08:55:59 crc kubenswrapper[4765]: I1003 08:55:59.870709 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6"] Oct 03 08:55:59 crc kubenswrapper[4765]: E1003 08:55:59.871333 4765 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="33ebdfff-a8cc-4e8f-99c3-a670a9f769cc" containerName="registry-server" Oct 03 08:55:59 crc kubenswrapper[4765]: I1003 08:55:59.871345 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ebdfff-a8cc-4e8f-99c3-a670a9f769cc" containerName="registry-server" Oct 03 08:55:59 crc kubenswrapper[4765]: I1003 08:55:59.871482 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ebdfff-a8cc-4e8f-99c3-a670a9f769cc" containerName="registry-server" Oct 03 08:55:59 crc kubenswrapper[4765]: I1003 08:55:59.872518 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:55:59 crc kubenswrapper[4765]: I1003 08:55:59.876000 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-fnnxp" Oct 03 08:55:59 crc kubenswrapper[4765]: I1003 08:55:59.883957 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6"] Oct 03 08:56:00 crc kubenswrapper[4765]: I1003 08:56:00.018040 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-util\") pod \"e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:00 crc kubenswrapper[4765]: I1003 08:56:00.018116 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-bundle\") pod \"e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:00 crc kubenswrapper[4765]: I1003 08:56:00.018317 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q56hq\" (UniqueName: \"kubernetes.io/projected/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-kube-api-access-q56hq\") pod \"e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:00 crc kubenswrapper[4765]: I1003 08:56:00.120025 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-bundle\") pod \"e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:00 crc kubenswrapper[4765]: I1003 08:56:00.120141 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q56hq\" (UniqueName: \"kubernetes.io/projected/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-kube-api-access-q56hq\") pod \"e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:00 crc kubenswrapper[4765]: I1003 08:56:00.120189 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-util\") pod \"e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:00 crc kubenswrapper[4765]: I1003 08:56:00.120624 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-bundle\") pod \"e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:00 crc kubenswrapper[4765]: I1003 08:56:00.120658 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-util\") pod \"e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:00 crc kubenswrapper[4765]: I1003 08:56:00.138505 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q56hq\" (UniqueName: \"kubernetes.io/projected/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-kube-api-access-q56hq\") pod \"e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:00 crc kubenswrapper[4765]: I1003 08:56:00.193392 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:00 crc kubenswrapper[4765]: I1003 08:56:00.663626 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6"] Oct 03 08:56:01 crc kubenswrapper[4765]: I1003 08:56:01.558239 4765 generic.go:334] "Generic (PLEG): container finished" podID="fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" containerID="26cfae7baab763c2e2b4790694f36afd328d71376edf35ff18f3d398d254ade3" exitCode=0 Oct 03 08:56:01 crc kubenswrapper[4765]: I1003 08:56:01.558354 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" event={"ID":"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce","Type":"ContainerDied","Data":"26cfae7baab763c2e2b4790694f36afd328d71376edf35ff18f3d398d254ade3"} Oct 03 08:56:01 crc kubenswrapper[4765]: I1003 08:56:01.558549 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" event={"ID":"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce","Type":"ContainerStarted","Data":"bf61bb63cabc1563e43faf4e41ba84832461382ff341fdfa4eb31cd16de1f841"} Oct 03 08:56:02 crc kubenswrapper[4765]: I1003 08:56:02.567368 4765 generic.go:334] "Generic (PLEG): container finished" podID="fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" containerID="04118d3f31335130601101d503100dbbdc5665a173c9d237854ca9b66c884707" exitCode=0 Oct 03 08:56:02 crc kubenswrapper[4765]: I1003 08:56:02.567460 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" event={"ID":"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce","Type":"ContainerDied","Data":"04118d3f31335130601101d503100dbbdc5665a173c9d237854ca9b66c884707"} Oct 03 08:56:03 crc kubenswrapper[4765]: I1003 08:56:03.575666 4765 generic.go:334] "Generic (PLEG): container finished" podID="fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" containerID="2d5a3b002807aeecfcbfa8b40b90fe22d63d2053ffc9c6ca017ce1cedd78b733" exitCode=0 Oct 03 08:56:03 crc kubenswrapper[4765]: I1003 08:56:03.575708 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" event={"ID":"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce","Type":"ContainerDied","Data":"2d5a3b002807aeecfcbfa8b40b90fe22d63d2053ffc9c6ca017ce1cedd78b733"} Oct 03 08:56:04 crc kubenswrapper[4765]: I1003 08:56:04.863687 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:04 crc kubenswrapper[4765]: I1003 08:56:04.979714 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-bundle\") pod \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " Oct 03 08:56:04 crc kubenswrapper[4765]: I1003 08:56:04.980164 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-util\") pod \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " Oct 03 08:56:04 crc kubenswrapper[4765]: I1003 08:56:04.980411 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q56hq\" (UniqueName: \"kubernetes.io/projected/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-kube-api-access-q56hq\") pod \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\" (UID: \"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce\") " Oct 03 08:56:04 crc kubenswrapper[4765]: I1003 08:56:04.980859 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-bundle" (OuterVolumeSpecName: "bundle") pod "fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" (UID: "fa66b2dd-bf1b-43db-b43a-e4adae1d77ce"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:56:04 crc kubenswrapper[4765]: I1003 08:56:04.994809 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-util" (OuterVolumeSpecName: "util") pod "fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" (UID: "fa66b2dd-bf1b-43db-b43a-e4adae1d77ce"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:56:04 crc kubenswrapper[4765]: I1003 08:56:04.997907 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-kube-api-access-q56hq" (OuterVolumeSpecName: "kube-api-access-q56hq") pod "fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" (UID: "fa66b2dd-bf1b-43db-b43a-e4adae1d77ce"). InnerVolumeSpecName "kube-api-access-q56hq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:56:05 crc kubenswrapper[4765]: I1003 08:56:05.081836 4765 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:05 crc kubenswrapper[4765]: I1003 08:56:05.082284 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q56hq\" (UniqueName: \"kubernetes.io/projected/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-kube-api-access-q56hq\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:05 crc kubenswrapper[4765]: I1003 08:56:05.082295 4765 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa66b2dd-bf1b-43db-b43a-e4adae1d77ce-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:05 crc kubenswrapper[4765]: I1003 08:56:05.591314 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" event={"ID":"fa66b2dd-bf1b-43db-b43a-e4adae1d77ce","Type":"ContainerDied","Data":"bf61bb63cabc1563e43faf4e41ba84832461382ff341fdfa4eb31cd16de1f841"} Oct 03 08:56:05 crc kubenswrapper[4765]: I1003 08:56:05.591349 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf61bb63cabc1563e43faf4e41ba84832461382ff341fdfa4eb31cd16de1f841" Oct 03 08:56:05 crc kubenswrapper[4765]: I1003 08:56:05.591624 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.730823 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84"] Oct 03 08:56:11 crc kubenswrapper[4765]: E1003 08:56:11.731610 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" containerName="extract" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.731624 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" containerName="extract" Oct 03 08:56:11 crc kubenswrapper[4765]: E1003 08:56:11.731676 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" containerName="pull" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.731683 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" containerName="pull" Oct 03 08:56:11 crc kubenswrapper[4765]: E1003 08:56:11.731694 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" containerName="util" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.731700 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" containerName="util" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.731856 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa66b2dd-bf1b-43db-b43a-e4adae1d77ce" containerName="extract" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.732716 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.740894 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-service-cert" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.741594 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5bqtp" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.741872 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84"] Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.779534 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-webhook-cert\") pod \"watcher-operator-controller-manager-597b5446d4-jpd84\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.779581 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66zxw\" (UniqueName: \"kubernetes.io/projected/d2b56943-c8b6-4dde-acf8-182435d0b704-kube-api-access-66zxw\") pod \"watcher-operator-controller-manager-597b5446d4-jpd84\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.779614 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-apiservice-cert\") pod \"watcher-operator-controller-manager-597b5446d4-jpd84\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.880925 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-webhook-cert\") pod \"watcher-operator-controller-manager-597b5446d4-jpd84\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.880993 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66zxw\" (UniqueName: \"kubernetes.io/projected/d2b56943-c8b6-4dde-acf8-182435d0b704-kube-api-access-66zxw\") pod \"watcher-operator-controller-manager-597b5446d4-jpd84\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.881038 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-apiservice-cert\") pod \"watcher-operator-controller-manager-597b5446d4-jpd84\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.890609 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-apiservice-cert\") pod \"watcher-operator-controller-manager-597b5446d4-jpd84\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.890660 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-webhook-cert\") pod \"watcher-operator-controller-manager-597b5446d4-jpd84\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:11 crc kubenswrapper[4765]: I1003 08:56:11.907536 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66zxw\" (UniqueName: \"kubernetes.io/projected/d2b56943-c8b6-4dde-acf8-182435d0b704-kube-api-access-66zxw\") pod \"watcher-operator-controller-manager-597b5446d4-jpd84\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.050778 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.344079 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8"] Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.345522 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.389266 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr5jq\" (UniqueName: \"kubernetes.io/projected/5b5423e1-1c3e-4cab-9cc7-112199b23e2e-kube-api-access-nr5jq\") pod \"watcher-operator-controller-manager-5d56cb75ff-b6xc8\" (UID: \"5b5423e1-1c3e-4cab-9cc7-112199b23e2e\") " pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.389356 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b5423e1-1c3e-4cab-9cc7-112199b23e2e-webhook-cert\") pod \"watcher-operator-controller-manager-5d56cb75ff-b6xc8\" (UID: \"5b5423e1-1c3e-4cab-9cc7-112199b23e2e\") " pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.389384 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b5423e1-1c3e-4cab-9cc7-112199b23e2e-apiservice-cert\") pod \"watcher-operator-controller-manager-5d56cb75ff-b6xc8\" (UID: \"5b5423e1-1c3e-4cab-9cc7-112199b23e2e\") " pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.415597 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8"] Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.490697 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b5423e1-1c3e-4cab-9cc7-112199b23e2e-webhook-cert\") pod \"watcher-operator-controller-manager-5d56cb75ff-b6xc8\" (UID: \"5b5423e1-1c3e-4cab-9cc7-112199b23e2e\") " pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.490764 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b5423e1-1c3e-4cab-9cc7-112199b23e2e-apiservice-cert\") pod \"watcher-operator-controller-manager-5d56cb75ff-b6xc8\" (UID: \"5b5423e1-1c3e-4cab-9cc7-112199b23e2e\") " pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.490850 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr5jq\" (UniqueName: \"kubernetes.io/projected/5b5423e1-1c3e-4cab-9cc7-112199b23e2e-kube-api-access-nr5jq\") pod \"watcher-operator-controller-manager-5d56cb75ff-b6xc8\" (UID: \"5b5423e1-1c3e-4cab-9cc7-112199b23e2e\") " pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.496381 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5b5423e1-1c3e-4cab-9cc7-112199b23e2e-webhook-cert\") pod \"watcher-operator-controller-manager-5d56cb75ff-b6xc8\" (UID: \"5b5423e1-1c3e-4cab-9cc7-112199b23e2e\") " pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.499146 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5b5423e1-1c3e-4cab-9cc7-112199b23e2e-apiservice-cert\") pod \"watcher-operator-controller-manager-5d56cb75ff-b6xc8\" (UID: \"5b5423e1-1c3e-4cab-9cc7-112199b23e2e\") " pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.510264 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr5jq\" (UniqueName: \"kubernetes.io/projected/5b5423e1-1c3e-4cab-9cc7-112199b23e2e-kube-api-access-nr5jq\") pod \"watcher-operator-controller-manager-5d56cb75ff-b6xc8\" (UID: \"5b5423e1-1c3e-4cab-9cc7-112199b23e2e\") " pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.549507 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84"] Oct 03 08:56:12 crc kubenswrapper[4765]: W1003 08:56:12.552616 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2b56943_c8b6_4dde_acf8_182435d0b704.slice/crio-fc962aa661453136f744ca26c68b82f21be331942a28dcd7e0ef9bf17631c577 WatchSource:0}: Error finding container fc962aa661453136f744ca26c68b82f21be331942a28dcd7e0ef9bf17631c577: Status 404 returned error can't find the container with id fc962aa661453136f744ca26c68b82f21be331942a28dcd7e0ef9bf17631c577 Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.645796 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" event={"ID":"d2b56943-c8b6-4dde-acf8-182435d0b704","Type":"ContainerStarted","Data":"fc962aa661453136f744ca26c68b82f21be331942a28dcd7e0ef9bf17631c577"} Oct 03 08:56:12 crc kubenswrapper[4765]: I1003 08:56:12.694802 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:13 crc kubenswrapper[4765]: I1003 08:56:13.181778 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8"] Oct 03 08:56:13 crc kubenswrapper[4765]: W1003 08:56:13.187821 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b5423e1_1c3e_4cab_9cc7_112199b23e2e.slice/crio-78d4f1cc8771f88a02e95d2c7c59cc73e89c9efb3faf7a7cf13cc3cd2e169dce WatchSource:0}: Error finding container 78d4f1cc8771f88a02e95d2c7c59cc73e89c9efb3faf7a7cf13cc3cd2e169dce: Status 404 returned error can't find the container with id 78d4f1cc8771f88a02e95d2c7c59cc73e89c9efb3faf7a7cf13cc3cd2e169dce Oct 03 08:56:13 crc kubenswrapper[4765]: I1003 08:56:13.655786 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" event={"ID":"5b5423e1-1c3e-4cab-9cc7-112199b23e2e","Type":"ContainerStarted","Data":"8ecbc3bd4422e3197c4fe9b1bafcfda35ef42e1d176403502d4b5c137cde9848"} Oct 03 08:56:13 crc kubenswrapper[4765]: I1003 08:56:13.655850 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" event={"ID":"5b5423e1-1c3e-4cab-9cc7-112199b23e2e","Type":"ContainerStarted","Data":"6d859e76d0157a73cb0be7b9659876324743b0108e1698f47d989a95d8d7693c"} Oct 03 08:56:13 crc kubenswrapper[4765]: I1003 08:56:13.655864 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" event={"ID":"5b5423e1-1c3e-4cab-9cc7-112199b23e2e","Type":"ContainerStarted","Data":"78d4f1cc8771f88a02e95d2c7c59cc73e89c9efb3faf7a7cf13cc3cd2e169dce"} Oct 03 08:56:13 crc kubenswrapper[4765]: I1003 08:56:13.655908 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:13 crc kubenswrapper[4765]: I1003 08:56:13.657879 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" event={"ID":"d2b56943-c8b6-4dde-acf8-182435d0b704","Type":"ContainerStarted","Data":"20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24"} Oct 03 08:56:13 crc kubenswrapper[4765]: I1003 08:56:13.657914 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" event={"ID":"d2b56943-c8b6-4dde-acf8-182435d0b704","Type":"ContainerStarted","Data":"be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c"} Oct 03 08:56:13 crc kubenswrapper[4765]: I1003 08:56:13.658034 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:13 crc kubenswrapper[4765]: I1003 08:56:13.678013 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" podStartSLOduration=1.6779927890000002 podStartE2EDuration="1.677992789s" podCreationTimestamp="2025-10-03 08:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:56:13.671087045 +0000 UTC m=+1017.972581375" 
watchObservedRunningTime="2025-10-03 08:56:13.677992789 +0000 UTC m=+1017.979487119" Oct 03 08:56:13 crc kubenswrapper[4765]: I1003 08:56:13.691184 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" podStartSLOduration=2.6911620000000003 podStartE2EDuration="2.691162s" podCreationTimestamp="2025-10-03 08:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:56:13.690011401 +0000 UTC m=+1017.991505751" watchObservedRunningTime="2025-10-03 08:56:13.691162 +0000 UTC m=+1017.992656330" Oct 03 08:56:22 crc kubenswrapper[4765]: I1003 08:56:22.069570 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:22 crc kubenswrapper[4765]: I1003 08:56:22.698676 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5d56cb75ff-b6xc8" Oct 03 08:56:22 crc kubenswrapper[4765]: I1003 08:56:22.755909 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84"] Oct 03 08:56:22 crc kubenswrapper[4765]: I1003 08:56:22.756161 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" podUID="d2b56943-c8b6-4dde-acf8-182435d0b704" containerName="manager" containerID="cri-o://be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c" gracePeriod=10 Oct 03 08:56:22 crc kubenswrapper[4765]: I1003 08:56:22.756248 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" podUID="d2b56943-c8b6-4dde-acf8-182435d0b704" containerName="kube-rbac-proxy" containerID="cri-o://20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24" gracePeriod=10 Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.185794 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.242702 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66zxw\" (UniqueName: \"kubernetes.io/projected/d2b56943-c8b6-4dde-acf8-182435d0b704-kube-api-access-66zxw\") pod \"d2b56943-c8b6-4dde-acf8-182435d0b704\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.242786 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-apiservice-cert\") pod \"d2b56943-c8b6-4dde-acf8-182435d0b704\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.242850 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-webhook-cert\") pod \"d2b56943-c8b6-4dde-acf8-182435d0b704\" (UID: \"d2b56943-c8b6-4dde-acf8-182435d0b704\") " Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.247761 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "d2b56943-c8b6-4dde-acf8-182435d0b704" (UID: "d2b56943-c8b6-4dde-acf8-182435d0b704"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.247964 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "d2b56943-c8b6-4dde-acf8-182435d0b704" (UID: "d2b56943-c8b6-4dde-acf8-182435d0b704"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.248131 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b56943-c8b6-4dde-acf8-182435d0b704-kube-api-access-66zxw" (OuterVolumeSpecName: "kube-api-access-66zxw") pod "d2b56943-c8b6-4dde-acf8-182435d0b704" (UID: "d2b56943-c8b6-4dde-acf8-182435d0b704"). InnerVolumeSpecName "kube-api-access-66zxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.344938 4765 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.344971 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66zxw\" (UniqueName: \"kubernetes.io/projected/d2b56943-c8b6-4dde-acf8-182435d0b704-kube-api-access-66zxw\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.344981 4765 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2b56943-c8b6-4dde-acf8-182435d0b704-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.739257 4765 generic.go:334] "Generic (PLEG): container finished" podID="d2b56943-c8b6-4dde-acf8-182435d0b704" containerID="20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24" exitCode=0 Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.739291 4765 generic.go:334] "Generic (PLEG): container finished" podID="d2b56943-c8b6-4dde-acf8-182435d0b704" containerID="be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c" exitCode=0 Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.739306 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" event={"ID":"d2b56943-c8b6-4dde-acf8-182435d0b704","Type":"ContainerDied","Data":"20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24"} Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.739361 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" event={"ID":"d2b56943-c8b6-4dde-acf8-182435d0b704","Type":"ContainerDied","Data":"be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c"} Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.739372 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" event={"ID":"d2b56943-c8b6-4dde-acf8-182435d0b704","Type":"ContainerDied","Data":"fc962aa661453136f744ca26c68b82f21be331942a28dcd7e0ef9bf17631c577"} Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.739389 4765 scope.go:117] "RemoveContainer" containerID="20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.739324 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.766048 4765 scope.go:117] "RemoveContainer" containerID="be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.774785 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84"] Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.779697 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-597b5446d4-jpd84"] Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.780545 4765 scope.go:117] "RemoveContainer" containerID="20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24" Oct 03 08:56:23 crc kubenswrapper[4765]: E1003 08:56:23.780979 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24\": container with ID starting with 20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24 not found: ID does not exist" containerID="20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.781014 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24"} err="failed to get container status \"20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24\": rpc error: code = NotFound desc = could not find container \"20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24\": container with ID starting with 20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24 not found: ID does not exist" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.781039 4765 scope.go:117] "RemoveContainer" containerID="be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c" Oct 03 08:56:23 crc kubenswrapper[4765]: E1003 08:56:23.781283 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c\": container with ID starting with be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c not found: ID does not exist" containerID="be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.781313 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c"} err="failed to get container status \"be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c\": rpc error: code = NotFound desc = could not find container \"be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c\": container with ID starting with be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c not found: ID does not exist" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.781334 4765 scope.go:117] "RemoveContainer" containerID="20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.781624 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24"} err="failed to get container status \"20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24\": rpc error: code = NotFound desc = could not find container \"20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24\": container with ID starting with 20de0d70b6e4ef1985421888890c3198b365a37223c3b1e17792270905bb6d24 not found: ID does not exist" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.781669 4765 scope.go:117] "RemoveContainer" containerID="be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c" Oct 03 08:56:23 crc kubenswrapper[4765]: I1003 08:56:23.781886 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c"} err="failed to get container status \"be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c\": rpc error: code = NotFound desc = could not find container \"be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c\": container with ID starting with be7dab624198443df0ab2081076c245d9ca821b96c7367a49297aba531f1e45c not found: ID does not exist" Oct 03 08:56:24 crc kubenswrapper[4765]: I1003 08:56:24.315570 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b56943-c8b6-4dde-acf8-182435d0b704" path="/var/lib/kubelet/pods/d2b56943-c8b6-4dde-acf8-182435d0b704/volumes" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.597087 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Oct 03 08:56:37 crc kubenswrapper[4765]: E1003 08:56:37.597880 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b56943-c8b6-4dde-acf8-182435d0b704" containerName="manager" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.597892 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b56943-c8b6-4dde-acf8-182435d0b704" containerName="manager" Oct 03 08:56:37 crc kubenswrapper[4765]: E1003 08:56:37.597900 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b56943-c8b6-4dde-acf8-182435d0b704" containerName="kube-rbac-proxy" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.597906 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b56943-c8b6-4dde-acf8-182435d0b704" containerName="kube-rbac-proxy" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.598039 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b56943-c8b6-4dde-acf8-182435d0b704" containerName="manager" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.598054 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b56943-c8b6-4dde-acf8-182435d0b704" containerName="kube-rbac-proxy" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.598926 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.604413 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-conf" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.604790 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openshift-service-ca.crt" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.605000 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-default-user" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.605244 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-server-dockercfg-qs9bg" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.605399 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-notifications-erlang-cookie" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.605523 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-config-data" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.605674 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"kube-root-ca.crt" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.605789 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-notifications-svc" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.606224 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-notifications-plugins-conf" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.616852 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.635998 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.636044 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/822ab948-07b5-4946-aeb3-d6cd9e4f6752-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.636097 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.636116 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/822ab948-07b5-4946-aeb3-d6cd9e4f6752-pod-info\") pod 
\"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.636131 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/822ab948-07b5-4946-aeb3-d6cd9e4f6752-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.636154 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-64e21ecb-6802-40db-b472-9d379d26c803\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64e21ecb-6802-40db-b472-9d379d26c803\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.636177 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.636191 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/822ab948-07b5-4946-aeb3-d6cd9e4f6752-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.636206 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.636233 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h59lp\" (UniqueName: \"kubernetes.io/projected/822ab948-07b5-4946-aeb3-d6cd9e4f6752-kube-api-access-h59lp\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.636250 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/822ab948-07b5-4946-aeb3-d6cd9e4f6752-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.737491 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/822ab948-07b5-4946-aeb3-d6cd9e4f6752-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " 
pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.737548 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/822ab948-07b5-4946-aeb3-d6cd9e4f6752-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.737578 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-64e21ecb-6802-40db-b472-9d379d26c803\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64e21ecb-6802-40db-b472-9d379d26c803\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.737607 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.737623 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/822ab948-07b5-4946-aeb3-d6cd9e4f6752-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.737656 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.737693 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h59lp\" (UniqueName: \"kubernetes.io/projected/822ab948-07b5-4946-aeb3-d6cd9e4f6752-kube-api-access-h59lp\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.737720 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/822ab948-07b5-4946-aeb3-d6cd9e4f6752-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.737767 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.737788 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/822ab948-07b5-4946-aeb3-d6cd9e4f6752-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.737836 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.738825 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.738905 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.739684 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/822ab948-07b5-4946-aeb3-d6cd9e4f6752-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.739736 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/822ab948-07b5-4946-aeb3-d6cd9e4f6752-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.739912 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/822ab948-07b5-4946-aeb3-d6cd9e4f6752-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.741220 4765 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.741261 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-64e21ecb-6802-40db-b472-9d379d26c803\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64e21ecb-6802-40db-b472-9d379d26c803\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/31311ec9d5cceabd361314543a5e4aefe1c8cd2c71546073e628d2986c18ba47/globalmount\"" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.743111 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.744293 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/822ab948-07b5-4946-aeb3-d6cd9e4f6752-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.746278 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/822ab948-07b5-4946-aeb3-d6cd9e4f6752-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.747544 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/822ab948-07b5-4946-aeb3-d6cd9e4f6752-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.760634 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h59lp\" (UniqueName: \"kubernetes.io/projected/822ab948-07b5-4946-aeb3-d6cd9e4f6752-kube-api-access-h59lp\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.771336 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-64e21ecb-6802-40db-b472-9d379d26c803\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-64e21ecb-6802-40db-b472-9d379d26c803\") pod \"rabbitmq-notifications-server-0\" (UID: \"822ab948-07b5-4946-aeb3-d6cd9e4f6752\") " pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.901232 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.905028 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.909767 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-plugins-conf" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.912138 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-default-user" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.912143 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-erlang-cookie" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.912382 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-config-data" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.912497 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"rabbitmq-server-dockercfg-sr8n2" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.918121 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"rabbitmq-server-conf" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.918574 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-rabbitmq-svc" Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.921503 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Oct 03 08:56:37 crc kubenswrapper[4765]: I1003 08:56:37.936213 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.043537 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.043591 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.043627 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee23f3ed-67bb-44ab-93fe-8251f7768941-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.043688 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee23f3ed-67bb-44ab-93fe-8251f7768941-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.043724 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.043751 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee23f3ed-67bb-44ab-93fe-8251f7768941-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.043799 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee23f3ed-67bb-44ab-93fe-8251f7768941-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.043839 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee23f3ed-67bb-44ab-93fe-8251f7768941-config-data\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.043865 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e995a88b-347d-44f4-b67d-79f63cace943\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e995a88b-347d-44f4-b67d-79f63cace943\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.043895 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.043916 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scn6p\" (UniqueName: \"kubernetes.io/projected/ee23f3ed-67bb-44ab-93fe-8251f7768941-kube-api-access-scn6p\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.147089 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee23f3ed-67bb-44ab-93fe-8251f7768941-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.147156 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee23f3ed-67bb-44ab-93fe-8251f7768941-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.147176 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ee23f3ed-67bb-44ab-93fe-8251f7768941-config-data\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.147195 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e995a88b-347d-44f4-b67d-79f63cace943\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e995a88b-347d-44f4-b67d-79f63cace943\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.147218 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.147238 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scn6p\" (UniqueName: \"kubernetes.io/projected/ee23f3ed-67bb-44ab-93fe-8251f7768941-kube-api-access-scn6p\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.147273 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.147293 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.147314 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee23f3ed-67bb-44ab-93fe-8251f7768941-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.147342 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee23f3ed-67bb-44ab-93fe-8251f7768941-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.147364 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.148593 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.149799 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee23f3ed-67bb-44ab-93fe-8251f7768941-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.149933 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.152679 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee23f3ed-67bb-44ab-93fe-8251f7768941-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.153734 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee23f3ed-67bb-44ab-93fe-8251f7768941-config-data\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.158512 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.158964 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee23f3ed-67bb-44ab-93fe-8251f7768941-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.159024 4765 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.159051 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e995a88b-347d-44f4-b67d-79f63cace943\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e995a88b-347d-44f4-b67d-79f63cace943\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/71462224054ef02956c53b22282d42b4f612c150089251a49fa6695c6f5ebceb/globalmount\"" pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.159330 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee23f3ed-67bb-44ab-93fe-8251f7768941-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.162406 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee23f3ed-67bb-44ab-93fe-8251f7768941-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.167425 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scn6p\" (UniqueName: \"kubernetes.io/projected/ee23f3ed-67bb-44ab-93fe-8251f7768941-kube-api-access-scn6p\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.191440 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e995a88b-347d-44f4-b67d-79f63cace943\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e995a88b-347d-44f4-b67d-79f63cace943\") pod \"rabbitmq-server-0\" (UID: \"ee23f3ed-67bb-44ab-93fe-8251f7768941\") " pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.243323 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.393961 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-notifications-server-0"] Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.725904 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/rabbitmq-server-0"] Oct 03 08:56:38 crc kubenswrapper[4765]: W1003 08:56:38.742378 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee23f3ed_67bb_44ab_93fe_8251f7768941.slice/crio-868ab32c9ca77a7ff94c9711a75a717c95e58465d1bb3500069dbcdc09982e2a WatchSource:0}: Error finding container 868ab32c9ca77a7ff94c9711a75a717c95e58465d1bb3500069dbcdc09982e2a: Status 404 returned error can't find the container with id 868ab32c9ca77a7ff94c9711a75a717c95e58465d1bb3500069dbcdc09982e2a Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.837043 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"ee23f3ed-67bb-44ab-93fe-8251f7768941","Type":"ContainerStarted","Data":"868ab32c9ca77a7ff94c9711a75a717c95e58465d1bb3500069dbcdc09982e2a"} Oct 03 08:56:38 crc kubenswrapper[4765]: I1003 08:56:38.838009 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"822ab948-07b5-4946-aeb3-d6cd9e4f6752","Type":"ContainerStarted","Data":"281caa8fccb770f908c097ecca469f3171f9c8e62d2f610d2613cf55378c4dd3"} Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.265470 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.267741 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.270704 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"osp-secret" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.271013 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"galera-openstack-dockercfg-6lf96" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.271168 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-galera-openstack-svc" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.271355 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config-data" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.277533 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-scripts" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.279469 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"combined-ca-bundle" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.279636 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.372212 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-config-data-default\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.372288 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.372436 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.372577 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.372615 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8fb2bf0d-fb03-493c-95b7-0b479e27b346\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8fb2bf0d-fb03-493c-95b7-0b479e27b346\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.372696 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" 
(UniqueName: \"kubernetes.io/secret/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-secrets\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.372726 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-kolla-config\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.372749 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.372783 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qkl8\" (UniqueName: \"kubernetes.io/projected/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-kube-api-access-2qkl8\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.477199 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.477328 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qkl8\" (UniqueName: \"kubernetes.io/projected/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-kube-api-access-2qkl8\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.477558 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-config-data-default\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.477585 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.477706 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.477847 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.477877 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8fb2bf0d-fb03-493c-95b7-0b479e27b346\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8fb2bf0d-fb03-493c-95b7-0b479e27b346\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.477929 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-secrets\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.477952 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-kolla-config\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.478935 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-kolla-config\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.483964 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.484734 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.484773 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.487219 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.487516 4765 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.487549 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8fb2bf0d-fb03-493c-95b7-0b479e27b346\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8fb2bf0d-fb03-493c-95b7-0b479e27b346\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/115ad2d66fc5a353c53ae2281b9458fbf49b37c0825f9df4b3f6e88233797014/globalmount\"" pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.494492 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-secrets\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.502229 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qkl8\" (UniqueName: \"kubernetes.io/projected/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-kube-api-access-2qkl8\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.503273 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3f7d0aed-bfcf-4589-a75c-d94328cf7b7a-config-data-default\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.552822 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8fb2bf0d-fb03-493c-95b7-0b479e27b346\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8fb2bf0d-fb03-493c-95b7-0b479e27b346\") pod \"openstack-galera-0\" (UID: \"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a\") " pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.610010 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.637166 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/memcached-0"] Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.638155 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.641305 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"memcached-memcached-dockercfg-9c22t" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.641968 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"memcached-config-data" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.645110 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.653302 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-svc" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.781639 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-kolla-config\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.781713 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-config-data\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.781741 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.781806 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.781831 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6tgg\" (UniqueName: \"kubernetes.io/projected/2336fb85-91d3-4450-90ee-52264f3dc39f-kube-api-access-h6tgg\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.883236 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-kolla-config\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.883591 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-config-data\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.883631 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.883721 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.883748 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6tgg\" (UniqueName: \"kubernetes.io/projected/2336fb85-91d3-4450-90ee-52264f3dc39f-kube-api-access-h6tgg\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.884183 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-kolla-config\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.885168 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-config-data\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.888923 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.902441 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6tgg\" (UniqueName: \"kubernetes.io/projected/2336fb85-91d3-4450-90ee-52264f3dc39f-kube-api-access-h6tgg\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:39 crc kubenswrapper[4765]: I1003 08:56:39.921902 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.001298 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.002406 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.012892 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"telemetry-ceilometer-dockercfg-bvsh8" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.021854 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.038536 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.091439 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7bdv\" (UniqueName: \"kubernetes.io/projected/14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6-kube-api-access-w7bdv\") pod \"kube-state-metrics-0\" (UID: \"14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.194730 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7bdv\" (UniqueName: \"kubernetes.io/projected/14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6-kube-api-access-w7bdv\") pod \"kube-state-metrics-0\" (UID: \"14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.255100 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7bdv\" (UniqueName: \"kubernetes.io/projected/14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6-kube-api-access-w7bdv\") pod \"kube-state-metrics-0\" (UID: \"14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.329054 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstack-galera-0"] Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.332459 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.761372 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.762863 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.777574 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-generated" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.777892 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-tls-assets-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.778077 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"alertmanager-metric-storage-web-config" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.778297 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-alertmanager-dockercfg-zpq7z" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.790779 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.836722 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.921595 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.921684 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hwbx\" (UniqueName: \"kubernetes.io/projected/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-kube-api-access-4hwbx\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.921723 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.921772 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.921805 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.921825 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.952810 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"2336fb85-91d3-4450-90ee-52264f3dc39f","Type":"ContainerStarted","Data":"5a4a2e4cd8d1e7efb6c2f15eba9297355f243ce065b8786932b959cdc56eea07"} Oct 03 08:56:40 crc kubenswrapper[4765]: I1003 08:56:40.955109 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a","Type":"ContainerStarted","Data":"b371c342636cb578eb6983179a727edb613602d173eee6b57c594af4e232afc8"} Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.023136 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.023210 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.023242 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.023284 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.023400 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hwbx\" (UniqueName: \"kubernetes.io/projected/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-kube-api-access-4hwbx\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.023492 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.028496 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.031062 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.032774 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.033635 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.034301 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.047462 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hwbx\" (UniqueName: \"kubernetes.io/projected/c0fe5012-3b98-4ef6-954c-27b4e962a1cc-kube-api-access-4hwbx\") pod \"alertmanager-metric-storage-0\" (UID: \"c0fe5012-3b98-4ef6-954c-27b4e962a1cc\") " pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.100632 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.183799 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d"] Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.185094 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.189956 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-flv82" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.190133 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.199403 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d"] Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.266400 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Oct 03 08:56:41 crc kubenswrapper[4765]: W1003 08:56:41.294825 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14d4a2cf_7c1b_4e8d_a42c_01cb979e78b6.slice/crio-c56ec4f720b52257866ffb6a14674698dc98b2e177ccc184e4072ddf9c240f40 WatchSource:0}: Error finding container c56ec4f720b52257866ffb6a14674698dc98b2e177ccc184e4072ddf9c240f40: Status 404 returned error can't find the container with id c56ec4f720b52257866ffb6a14674698dc98b2e177ccc184e4072ddf9c240f40 Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.328673 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmrvv\" (UniqueName: \"kubernetes.io/projected/1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7-kube-api-access-cmrvv\") pod \"observability-ui-dashboards-6584dc9448-d5k5d\" (UID: \"1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.328717 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-d5k5d\" (UID: \"1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.430610 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmrvv\" (UniqueName: \"kubernetes.io/projected/1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7-kube-api-access-cmrvv\") pod \"observability-ui-dashboards-6584dc9448-d5k5d\" (UID: \"1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.430671 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-d5k5d\" (UID: \"1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.436481 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7-serving-cert\") pod \"observability-ui-dashboards-6584dc9448-d5k5d\" (UID: \"1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7\") " 
pod="openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.438023 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.440975 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.459681 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmrvv\" (UniqueName: \"kubernetes.io/projected/1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7-kube-api-access-cmrvv\") pod \"observability-ui-dashboards-6584dc9448-d5k5d\" (UID: \"1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7\") " pod="openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.460921 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.461179 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.461299 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.461450 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.461585 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-qmkx5" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.461700 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.474670 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.512065 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.531815 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.531870 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshq2\" (UniqueName: \"kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-kube-api-access-wshq2\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.531912 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.531952 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.531980 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c29c72f9-5956-4af4-8936-e14f5d0ea18a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.532029 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.532068 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.532130 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc 
kubenswrapper[4765]: I1003 08:56:41.607384 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-997fd5d99-2gxvt"] Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.608725 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.620250 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-997fd5d99-2gxvt"] Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.647595 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.647668 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wshq2\" (UniqueName: \"kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-kube-api-access-wshq2\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.647704 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.647735 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.647754 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c29c72f9-5956-4af4-8936-e14f5d0ea18a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.648530 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.648576 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.648634 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.653666 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.656073 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.656123 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c29c72f9-5956-4af4-8936-e14f5d0ea18a-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.656700 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.659056 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.670278 4765 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.670316 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bd2eec3abdd28a1cd7561a06af9eac99c6b9120801b74599f1637a7e0294eead/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.671503 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.685526 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshq2\" (UniqueName: \"kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-kube-api-access-wshq2\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.715076 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\") pod \"prometheus-metric-storage-0\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.756741 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-service-ca\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.756821 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww7px\" (UniqueName: \"kubernetes.io/projected/e9210740-b0e3-476a-b527-043bf0b4e587-kube-api-access-ww7px\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.756854 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-trusted-ca-bundle\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.756876 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9210740-b0e3-476a-b527-043bf0b4e587-console-serving-cert\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc 
kubenswrapper[4765]: I1003 08:56:41.756912 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-oauth-serving-cert\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.759907 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-console-config\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.759971 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9210740-b0e3-476a-b527-043bf0b4e587-console-oauth-config\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.818858 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.854538 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/alertmanager-metric-storage-0"] Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.861601 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-console-config\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.861659 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9210740-b0e3-476a-b527-043bf0b4e587-console-oauth-config\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.861695 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-service-ca\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.861741 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww7px\" (UniqueName: \"kubernetes.io/projected/e9210740-b0e3-476a-b527-043bf0b4e587-kube-api-access-ww7px\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.861769 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-trusted-ca-bundle\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 
08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.861787 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9210740-b0e3-476a-b527-043bf0b4e587-console-serving-cert\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.861821 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-oauth-serving-cert\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.863110 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-console-config\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.865667 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-trusted-ca-bundle\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.873883 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-service-ca\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.888024 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e9210740-b0e3-476a-b527-043bf0b4e587-console-serving-cert\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.889490 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e9210740-b0e3-476a-b527-043bf0b4e587-oauth-serving-cert\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.909196 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e9210740-b0e3-476a-b527-043bf0b4e587-console-oauth-config\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.916928 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww7px\" (UniqueName: \"kubernetes.io/projected/e9210740-b0e3-476a-b527-043bf0b4e587-kube-api-access-ww7px\") pod \"console-997fd5d99-2gxvt\" (UID: \"e9210740-b0e3-476a-b527-043bf0b4e587\") " pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.968411 4765 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.993408 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6","Type":"ContainerStarted","Data":"c56ec4f720b52257866ffb6a14674698dc98b2e177ccc184e4072ddf9c240f40"} Oct 03 08:56:41 crc kubenswrapper[4765]: I1003 08:56:41.994890 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"c0fe5012-3b98-4ef6-954c-27b4e962a1cc","Type":"ContainerStarted","Data":"55bd57afe69b9935b1f1b756d2b33ac0ee62a5d6119bcb1220152144a4d849f3"} Oct 03 08:56:42 crc kubenswrapper[4765]: I1003 08:56:42.265169 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d"] Oct 03 08:56:42 crc kubenswrapper[4765]: I1003 08:56:42.272550 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Oct 03 08:56:42 crc kubenswrapper[4765]: I1003 08:56:42.581666 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-997fd5d99-2gxvt"] Oct 03 08:56:43 crc kubenswrapper[4765]: I1003 08:56:43.032148 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d" event={"ID":"1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7","Type":"ContainerStarted","Data":"7807b063b135d14ea49111091f2fff187b3c3b88636b11d45c9093d01ad9c126"} Oct 03 08:56:43 crc kubenswrapper[4765]: I1003 08:56:43.038821 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-997fd5d99-2gxvt" event={"ID":"e9210740-b0e3-476a-b527-043bf0b4e587","Type":"ContainerStarted","Data":"e896649554a4b095367cdd37afcff3e092736958b53d2135d1b70218739827b9"} Oct 03 08:56:43 crc kubenswrapper[4765]: I1003 08:56:43.040881 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c29c72f9-5956-4af4-8936-e14f5d0ea18a","Type":"ContainerStarted","Data":"481790c0b054b5d75a13abd20c15c82d035ba0b282d8161fabde53f5716bd714"} Oct 03 08:56:45 crc kubenswrapper[4765]: I1003 08:56:45.073947 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-997fd5d99-2gxvt" event={"ID":"e9210740-b0e3-476a-b527-043bf0b4e587","Type":"ContainerStarted","Data":"d349422b66f257c5922c54f97356bed7954606b076ffec1c3c216f6953d15f8e"} Oct 03 08:56:45 crc kubenswrapper[4765]: I1003 08:56:45.099982 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-997fd5d99-2gxvt" podStartSLOduration=4.099957255 podStartE2EDuration="4.099957255s" podCreationTimestamp="2025-10-03 08:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:56:45.091851611 +0000 UTC m=+1049.393345961" watchObservedRunningTime="2025-10-03 08:56:45.099957255 +0000 UTC m=+1049.401451585" Oct 03 08:56:51 crc kubenswrapper[4765]: I1003 08:56:51.968962 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:51 crc kubenswrapper[4765]: I1003 08:56:51.969489 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:51 crc 
kubenswrapper[4765]: I1003 08:56:51.975272 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:52 crc kubenswrapper[4765]: I1003 08:56:52.135487 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-997fd5d99-2gxvt" Oct 03 08:56:52 crc kubenswrapper[4765]: I1003 08:56:52.197845 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-848877996-52ncb"] Oct 03 08:56:54 crc kubenswrapper[4765]: I1003 08:56:54.153145 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"2336fb85-91d3-4450-90ee-52264f3dc39f","Type":"ContainerStarted","Data":"95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088"} Oct 03 08:56:54 crc kubenswrapper[4765]: I1003 08:56:54.153684 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/memcached-0" Oct 03 08:56:54 crc kubenswrapper[4765]: I1003 08:56:54.155281 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6","Type":"ContainerStarted","Data":"1b6e759505f867fd22a8088522d2da7078cb4c81ab2d23b44ac6b3214f9217a0"} Oct 03 08:56:54 crc kubenswrapper[4765]: I1003 08:56:54.155422 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:56:54 crc kubenswrapper[4765]: I1003 08:56:54.173035 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/memcached-0" podStartSLOduration=2.838522898 podStartE2EDuration="15.173012821s" podCreationTimestamp="2025-10-03 08:56:39 +0000 UTC" firstStartedPulling="2025-10-03 08:56:40.825721198 +0000 UTC m=+1045.127215528" lastFinishedPulling="2025-10-03 08:56:53.160211121 +0000 UTC m=+1057.461705451" observedRunningTime="2025-10-03 08:56:54.169232025 +0000 UTC m=+1058.470726385" watchObservedRunningTime="2025-10-03 08:56:54.173012821 +0000 UTC m=+1058.474507151" Oct 03 08:56:54 crc kubenswrapper[4765]: I1003 08:56:54.183219 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=3.455717443 podStartE2EDuration="15.183198377s" podCreationTimestamp="2025-10-03 08:56:39 +0000 UTC" firstStartedPulling="2025-10-03 08:56:41.365141655 +0000 UTC m=+1045.666635985" lastFinishedPulling="2025-10-03 08:56:53.092622589 +0000 UTC m=+1057.394116919" observedRunningTime="2025-10-03 08:56:54.182185042 +0000 UTC m=+1058.483679372" watchObservedRunningTime="2025-10-03 08:56:54.183198377 +0000 UTC m=+1058.484692707" Oct 03 08:56:55 crc kubenswrapper[4765]: I1003 08:56:55.164251 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"822ab948-07b5-4946-aeb3-d6cd9e4f6752","Type":"ContainerStarted","Data":"1e8a5d1c89041686731f9e7c7dc6265109cae39d92fe1aef985dc7dff1dfc2fe"} Oct 03 08:56:55 crc kubenswrapper[4765]: I1003 08:56:55.166062 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a","Type":"ContainerStarted","Data":"d96481727373acda0c1f7044a942c209a775e2e68cc44c784a10811772106d4e"} Oct 03 08:56:55 crc kubenswrapper[4765]: I1003 08:56:55.167531 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d" event={"ID":"1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7","Type":"ContainerStarted","Data":"e64a989db819479906376e62ab559d36e93a95facd916de313d86786ca646cfd"} Oct 03 08:56:55 crc kubenswrapper[4765]: I1003 08:56:55.204259 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-6584dc9448-d5k5d" podStartSLOduration=6.646256539 podStartE2EDuration="14.204240675s" podCreationTimestamp="2025-10-03 08:56:41 +0000 UTC" firstStartedPulling="2025-10-03 08:56:42.494274685 +0000 UTC m=+1046.795769015" lastFinishedPulling="2025-10-03 08:56:50.052258821 +0000 UTC m=+1054.353753151" observedRunningTime="2025-10-03 08:56:55.20166634 +0000 UTC m=+1059.503160680" watchObservedRunningTime="2025-10-03 08:56:55.204240675 +0000 UTC m=+1059.505735005" Oct 03 08:56:56 crc kubenswrapper[4765]: I1003 08:56:56.175591 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"ee23f3ed-67bb-44ab-93fe-8251f7768941","Type":"ContainerStarted","Data":"38cdce0c0561407df5f0137ac96690dacbbb0ded9a974468f6e258e7860865c8"} Oct 03 08:56:56 crc kubenswrapper[4765]: I1003 08:56:56.177398 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"c0fe5012-3b98-4ef6-954c-27b4e962a1cc","Type":"ContainerStarted","Data":"8ecafe6f78b43219b946b52257cca81593923afce9e116429f557bf714fa2774"} Oct 03 08:56:56 crc kubenswrapper[4765]: I1003 08:56:56.179252 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c29c72f9-5956-4af4-8936-e14f5d0ea18a","Type":"ContainerStarted","Data":"e8513b286421b80f51fa44472253d5d8576a7e0684141b66f00d41203128d445"} Oct 03 08:56:58 crc kubenswrapper[4765]: I1003 08:56:58.198430 4765 generic.go:334] "Generic (PLEG): container finished" podID="3f7d0aed-bfcf-4589-a75c-d94328cf7b7a" containerID="d96481727373acda0c1f7044a942c209a775e2e68cc44c784a10811772106d4e" exitCode=0 Oct 03 08:56:58 crc kubenswrapper[4765]: I1003 08:56:58.198586 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a","Type":"ContainerDied","Data":"d96481727373acda0c1f7044a942c209a775e2e68cc44c784a10811772106d4e"} Oct 03 08:56:59 crc kubenswrapper[4765]: I1003 08:56:59.209702 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstack-galera-0" event={"ID":"3f7d0aed-bfcf-4589-a75c-d94328cf7b7a","Type":"ContainerStarted","Data":"480e9170918b02d4dd851583e77282202dba6ca06091b9d7fe73b1f013ed8af0"} Oct 03 08:56:59 crc kubenswrapper[4765]: I1003 08:56:59.233335 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstack-galera-0" podStartSLOduration=8.576515306 podStartE2EDuration="21.233315567s" podCreationTimestamp="2025-10-03 08:56:38 +0000 UTC" firstStartedPulling="2025-10-03 08:56:40.482430892 +0000 UTC m=+1044.783925222" lastFinishedPulling="2025-10-03 08:56:53.139231153 +0000 UTC m=+1057.440725483" observedRunningTime="2025-10-03 08:56:59.227899971 +0000 UTC m=+1063.529394321" watchObservedRunningTime="2025-10-03 08:56:59.233315567 +0000 UTC m=+1063.534809897" Oct 03 08:56:59 crc kubenswrapper[4765]: I1003 08:56:59.611245 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/openstack-galera-0" Oct 03 
08:56:59 crc kubenswrapper[4765]: I1003 08:56:59.611345 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:57:00 crc kubenswrapper[4765]: I1003 08:57:00.023544 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/memcached-0" Oct 03 08:57:00 crc kubenswrapper[4765]: I1003 08:57:00.338209 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:57:02 crc kubenswrapper[4765]: I1003 08:57:02.230038 4765 generic.go:334] "Generic (PLEG): container finished" podID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerID="e8513b286421b80f51fa44472253d5d8576a7e0684141b66f00d41203128d445" exitCode=0 Oct 03 08:57:02 crc kubenswrapper[4765]: I1003 08:57:02.230098 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c29c72f9-5956-4af4-8936-e14f5d0ea18a","Type":"ContainerDied","Data":"e8513b286421b80f51fa44472253d5d8576a7e0684141b66f00d41203128d445"} Oct 03 08:57:02 crc kubenswrapper[4765]: I1003 08:57:02.240492 4765 generic.go:334] "Generic (PLEG): container finished" podID="c0fe5012-3b98-4ef6-954c-27b4e962a1cc" containerID="8ecafe6f78b43219b946b52257cca81593923afce9e116429f557bf714fa2774" exitCode=0 Oct 03 08:57:02 crc kubenswrapper[4765]: I1003 08:57:02.240539 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"c0fe5012-3b98-4ef6-954c-27b4e962a1cc","Type":"ContainerDied","Data":"8ecafe6f78b43219b946b52257cca81593923afce9e116429f557bf714fa2774"} Oct 03 08:57:03 crc kubenswrapper[4765]: I1003 08:57:03.733980 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:57:03 crc kubenswrapper[4765]: I1003 08:57:03.785703 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/openstack-galera-0" Oct 03 08:57:09 crc kubenswrapper[4765]: I1003 08:57:09.700487 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-create-mvkpw"] Oct 03 08:57:09 crc kubenswrapper[4765]: I1003 08:57:09.702146 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-mvkpw" Oct 03 08:57:09 crc kubenswrapper[4765]: I1003 08:57:09.715342 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-mvkpw"] Oct 03 08:57:09 crc kubenswrapper[4765]: I1003 08:57:09.896265 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvc56\" (UniqueName: \"kubernetes.io/projected/96645098-7ec7-4672-8f05-bc20105308e3-kube-api-access-gvc56\") pod \"keystone-db-create-mvkpw\" (UID: \"96645098-7ec7-4672-8f05-bc20105308e3\") " pod="watcher-kuttl-default/keystone-db-create-mvkpw" Oct 03 08:57:09 crc kubenswrapper[4765]: I1003 08:57:09.998083 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvc56\" (UniqueName: \"kubernetes.io/projected/96645098-7ec7-4672-8f05-bc20105308e3-kube-api-access-gvc56\") pod \"keystone-db-create-mvkpw\" (UID: \"96645098-7ec7-4672-8f05-bc20105308e3\") " pod="watcher-kuttl-default/keystone-db-create-mvkpw" Oct 03 08:57:10 crc kubenswrapper[4765]: I1003 08:57:10.034704 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvc56\" (UniqueName: \"kubernetes.io/projected/96645098-7ec7-4672-8f05-bc20105308e3-kube-api-access-gvc56\") pod \"keystone-db-create-mvkpw\" (UID: \"96645098-7ec7-4672-8f05-bc20105308e3\") " pod="watcher-kuttl-default/keystone-db-create-mvkpw" Oct 03 08:57:10 crc kubenswrapper[4765]: I1003 08:57:10.331146 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-mvkpw" Oct 03 08:57:13 crc kubenswrapper[4765]: I1003 08:57:13.559972 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-create-mvkpw"] Oct 03 08:57:14 crc kubenswrapper[4765]: I1003 08:57:14.337941 4765 generic.go:334] "Generic (PLEG): container finished" podID="96645098-7ec7-4672-8f05-bc20105308e3" containerID="bcb2cb47a5852d6096be015b93463d8481e5cc8d0b8245a1e1d41c66b6ec90fb" exitCode=0 Oct 03 08:57:14 crc kubenswrapper[4765]: I1003 08:57:14.338059 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-mvkpw" event={"ID":"96645098-7ec7-4672-8f05-bc20105308e3","Type":"ContainerDied","Data":"bcb2cb47a5852d6096be015b93463d8481e5cc8d0b8245a1e1d41c66b6ec90fb"} Oct 03 08:57:14 crc kubenswrapper[4765]: I1003 08:57:14.338341 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-mvkpw" event={"ID":"96645098-7ec7-4672-8f05-bc20105308e3","Type":"ContainerStarted","Data":"69e28de50f20a145185194904ef6500b58c016bf26c2f8ef5c19b7c96de2045e"} Oct 03 08:57:14 crc kubenswrapper[4765]: I1003 08:57:14.340175 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"c0fe5012-3b98-4ef6-954c-27b4e962a1cc","Type":"ContainerStarted","Data":"f48612283dc0331e8fa7ae46bb6fc561b401004177f2383335d86901bfee88f9"} Oct 03 08:57:14 crc kubenswrapper[4765]: I1003 08:57:14.342291 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c29c72f9-5956-4af4-8936-e14f5d0ea18a","Type":"ContainerStarted","Data":"0c229e566e190e473e9e2c9487ed215a62248b0201bc79de8834c43312fac2f3"} Oct 03 08:57:15 crc kubenswrapper[4765]: I1003 08:57:15.735233 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-mvkpw" Oct 03 08:57:15 crc kubenswrapper[4765]: I1003 08:57:15.897855 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvc56\" (UniqueName: \"kubernetes.io/projected/96645098-7ec7-4672-8f05-bc20105308e3-kube-api-access-gvc56\") pod \"96645098-7ec7-4672-8f05-bc20105308e3\" (UID: \"96645098-7ec7-4672-8f05-bc20105308e3\") " Oct 03 08:57:15 crc kubenswrapper[4765]: I1003 08:57:15.905040 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96645098-7ec7-4672-8f05-bc20105308e3-kube-api-access-gvc56" (OuterVolumeSpecName: "kube-api-access-gvc56") pod "96645098-7ec7-4672-8f05-bc20105308e3" (UID: "96645098-7ec7-4672-8f05-bc20105308e3"). InnerVolumeSpecName "kube-api-access-gvc56". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:15 crc kubenswrapper[4765]: I1003 08:57:15.999432 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvc56\" (UniqueName: \"kubernetes.io/projected/96645098-7ec7-4672-8f05-bc20105308e3-kube-api-access-gvc56\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:16 crc kubenswrapper[4765]: I1003 08:57:16.362372 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c29c72f9-5956-4af4-8936-e14f5d0ea18a","Type":"ContainerStarted","Data":"688227fda8445cdeae376833574da4045f8c177b73d7aa368b47852a6b6c9443"} Oct 03 08:57:16 crc kubenswrapper[4765]: I1003 08:57:16.364389 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-create-mvkpw" event={"ID":"96645098-7ec7-4672-8f05-bc20105308e3","Type":"ContainerDied","Data":"69e28de50f20a145185194904ef6500b58c016bf26c2f8ef5c19b7c96de2045e"} Oct 03 08:57:16 crc kubenswrapper[4765]: I1003 08:57:16.364438 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69e28de50f20a145185194904ef6500b58c016bf26c2f8ef5c19b7c96de2045e" Oct 03 08:57:16 crc kubenswrapper[4765]: I1003 08:57:16.364558 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-create-mvkpw" Oct 03 08:57:16 crc kubenswrapper[4765]: I1003 08:57:16.379750 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/alertmanager-metric-storage-0" event={"ID":"c0fe5012-3b98-4ef6-954c-27b4e962a1cc","Type":"ContainerStarted","Data":"9b41158674698a3134b4c8e02b731e1e077f190797d925318b4747693764d30f"} Oct 03 08:57:16 crc kubenswrapper[4765]: I1003 08:57:16.380161 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:57:16 crc kubenswrapper[4765]: I1003 08:57:16.385546 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/alertmanager-metric-storage-0" Oct 03 08:57:16 crc kubenswrapper[4765]: I1003 08:57:16.442461 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/alertmanager-metric-storage-0" podStartSLOduration=5.27248765 podStartE2EDuration="36.442441809s" podCreationTimestamp="2025-10-03 08:56:40 +0000 UTC" firstStartedPulling="2025-10-03 08:56:41.895828102 +0000 UTC m=+1046.197322442" lastFinishedPulling="2025-10-03 08:57:13.065782271 +0000 UTC m=+1077.367276601" observedRunningTime="2025-10-03 08:57:16.412772092 +0000 UTC m=+1080.714266432" watchObservedRunningTime="2025-10-03 08:57:16.442441809 +0000 UTC m=+1080.743936139" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.266184 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-848877996-52ncb" podUID="64809084-8cca-4e95-ace6-5ecfcf98b208" containerName="console" containerID="cri-o://2d9da8541379e88bba639a15369f9fde0e3eb86a21afce0f6a1db49db0ab4b39" gracePeriod=15 Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.390121 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-848877996-52ncb_64809084-8cca-4e95-ace6-5ecfcf98b208/console/0.log" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.390166 4765 generic.go:334] "Generic (PLEG): container finished" podID="64809084-8cca-4e95-ace6-5ecfcf98b208" containerID="2d9da8541379e88bba639a15369f9fde0e3eb86a21afce0f6a1db49db0ab4b39" exitCode=2 Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.391104 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-848877996-52ncb" event={"ID":"64809084-8cca-4e95-ace6-5ecfcf98b208","Type":"ContainerDied","Data":"2d9da8541379e88bba639a15369f9fde0e3eb86a21afce0f6a1db49db0ab4b39"} Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.682243 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-848877996-52ncb_64809084-8cca-4e95-ace6-5ecfcf98b208/console/0.log" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.682541 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-848877996-52ncb" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.831117 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-service-ca\") pod \"64809084-8cca-4e95-ace6-5ecfcf98b208\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.831207 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkvs7\" (UniqueName: \"kubernetes.io/projected/64809084-8cca-4e95-ace6-5ecfcf98b208-kube-api-access-kkvs7\") pod \"64809084-8cca-4e95-ace6-5ecfcf98b208\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.831234 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-trusted-ca-bundle\") pod \"64809084-8cca-4e95-ace6-5ecfcf98b208\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.831285 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-serving-cert\") pod \"64809084-8cca-4e95-ace6-5ecfcf98b208\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.831338 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-console-config\") pod \"64809084-8cca-4e95-ace6-5ecfcf98b208\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.831381 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-oauth-serving-cert\") pod \"64809084-8cca-4e95-ace6-5ecfcf98b208\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.831432 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-oauth-config\") pod \"64809084-8cca-4e95-ace6-5ecfcf98b208\" (UID: \"64809084-8cca-4e95-ace6-5ecfcf98b208\") " Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.832311 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "64809084-8cca-4e95-ace6-5ecfcf98b208" (UID: "64809084-8cca-4e95-ace6-5ecfcf98b208"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.832344 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-console-config" (OuterVolumeSpecName: "console-config") pod "64809084-8cca-4e95-ace6-5ecfcf98b208" (UID: "64809084-8cca-4e95-ace6-5ecfcf98b208"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.832352 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "64809084-8cca-4e95-ace6-5ecfcf98b208" (UID: "64809084-8cca-4e95-ace6-5ecfcf98b208"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.832390 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-service-ca" (OuterVolumeSpecName: "service-ca") pod "64809084-8cca-4e95-ace6-5ecfcf98b208" (UID: "64809084-8cca-4e95-ace6-5ecfcf98b208"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.832799 4765 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.832823 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.832837 4765 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.832848 4765 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/64809084-8cca-4e95-ace6-5ecfcf98b208-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.836373 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64809084-8cca-4e95-ace6-5ecfcf98b208-kube-api-access-kkvs7" (OuterVolumeSpecName: "kube-api-access-kkvs7") pod "64809084-8cca-4e95-ace6-5ecfcf98b208" (UID: "64809084-8cca-4e95-ace6-5ecfcf98b208"). InnerVolumeSpecName "kube-api-access-kkvs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.837004 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "64809084-8cca-4e95-ace6-5ecfcf98b208" (UID: "64809084-8cca-4e95-ace6-5ecfcf98b208"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.841919 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "64809084-8cca-4e95-ace6-5ecfcf98b208" (UID: "64809084-8cca-4e95-ace6-5ecfcf98b208"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.934212 4765 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.934255 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkvs7\" (UniqueName: \"kubernetes.io/projected/64809084-8cca-4e95-ace6-5ecfcf98b208-kube-api-access-kkvs7\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:17 crc kubenswrapper[4765]: I1003 08:57:17.934269 4765 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/64809084-8cca-4e95-ace6-5ecfcf98b208-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:18 crc kubenswrapper[4765]: I1003 08:57:18.402618 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-848877996-52ncb_64809084-8cca-4e95-ace6-5ecfcf98b208/console/0.log" Oct 03 08:57:18 crc kubenswrapper[4765]: I1003 08:57:18.402747 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-848877996-52ncb" event={"ID":"64809084-8cca-4e95-ace6-5ecfcf98b208","Type":"ContainerDied","Data":"f1a9920efd5c05acc7a769148e2d708e52c9564ff040eda2315f13862846dd64"} Oct 03 08:57:18 crc kubenswrapper[4765]: I1003 08:57:18.402785 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-848877996-52ncb" Oct 03 08:57:18 crc kubenswrapper[4765]: I1003 08:57:18.402822 4765 scope.go:117] "RemoveContainer" containerID="2d9da8541379e88bba639a15369f9fde0e3eb86a21afce0f6a1db49db0ab4b39" Oct 03 08:57:18 crc kubenswrapper[4765]: I1003 08:57:18.431840 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-848877996-52ncb"] Oct 03 08:57:18 crc kubenswrapper[4765]: I1003 08:57:18.436797 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-848877996-52ncb"] Oct 03 08:57:19 crc kubenswrapper[4765]: I1003 08:57:19.417172 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c29c72f9-5956-4af4-8936-e14f5d0ea18a","Type":"ContainerStarted","Data":"70438e2c809a5df05c1276583488a4fafe8953d53a962c8c6ac88dc81270bd9d"} Oct 03 08:57:19 crc kubenswrapper[4765]: I1003 08:57:19.447021 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=3.067868622 podStartE2EDuration="39.446998446s" podCreationTimestamp="2025-10-03 08:56:40 +0000 UTC" firstStartedPulling="2025-10-03 08:56:42.494781278 +0000 UTC m=+1046.796275608" lastFinishedPulling="2025-10-03 08:57:18.873911102 +0000 UTC m=+1083.175405432" observedRunningTime="2025-10-03 08:57:19.440750969 +0000 UTC m=+1083.742245319" watchObservedRunningTime="2025-10-03 08:57:19.446998446 +0000 UTC m=+1083.748492776" Oct 03 08:57:20 crc kubenswrapper[4765]: I1003 08:57:20.316306 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64809084-8cca-4e95-ace6-5ecfcf98b208" path="/var/lib/kubelet/pods/64809084-8cca-4e95-ace6-5ecfcf98b208/volumes" Oct 03 08:57:21 crc kubenswrapper[4765]: I1003 08:57:21.819253 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:26 crc 
kubenswrapper[4765]: I1003 08:57:26.820031 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:26 crc kubenswrapper[4765]: I1003 08:57:26.823508 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:27 crc kubenswrapper[4765]: I1003 08:57:27.480399 4765 generic.go:334] "Generic (PLEG): container finished" podID="ee23f3ed-67bb-44ab-93fe-8251f7768941" containerID="38cdce0c0561407df5f0137ac96690dacbbb0ded9a974468f6e258e7860865c8" exitCode=0 Oct 03 08:57:27 crc kubenswrapper[4765]: I1003 08:57:27.480475 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"ee23f3ed-67bb-44ab-93fe-8251f7768941","Type":"ContainerDied","Data":"38cdce0c0561407df5f0137ac96690dacbbb0ded9a974468f6e258e7860865c8"} Oct 03 08:57:27 crc kubenswrapper[4765]: I1003 08:57:27.482759 4765 generic.go:334] "Generic (PLEG): container finished" podID="822ab948-07b5-4946-aeb3-d6cd9e4f6752" containerID="1e8a5d1c89041686731f9e7c7dc6265109cae39d92fe1aef985dc7dff1dfc2fe" exitCode=0 Oct 03 08:57:27 crc kubenswrapper[4765]: I1003 08:57:27.482832 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"822ab948-07b5-4946-aeb3-d6cd9e4f6752","Type":"ContainerDied","Data":"1e8a5d1c89041686731f9e7c7dc6265109cae39d92fe1aef985dc7dff1dfc2fe"} Oct 03 08:57:27 crc kubenswrapper[4765]: I1003 08:57:27.484328 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:28 crc kubenswrapper[4765]: I1003 08:57:28.493495 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" event={"ID":"822ab948-07b5-4946-aeb3-d6cd9e4f6752","Type":"ContainerStarted","Data":"910e3efed84f3e5e95ef3fdba0306acc1f3f87d933422fb2cd747e4f30722410"} Oct 03 08:57:28 crc kubenswrapper[4765]: I1003 08:57:28.494012 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:57:28 crc kubenswrapper[4765]: I1003 08:57:28.496402 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/rabbitmq-server-0" event={"ID":"ee23f3ed-67bb-44ab-93fe-8251f7768941","Type":"ContainerStarted","Data":"0c408a419993b0afaf1a4cc7eab1188ebb957e1d0a49da46f7afb042d609cc47"} Oct 03 08:57:28 crc kubenswrapper[4765]: I1003 08:57:28.496928 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:57:28 crc kubenswrapper[4765]: I1003 08:57:28.538042 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" podStartSLOduration=37.779144048 podStartE2EDuration="52.538027125s" podCreationTimestamp="2025-10-03 08:56:36 +0000 UTC" firstStartedPulling="2025-10-03 08:56:38.417927073 +0000 UTC m=+1042.719421403" lastFinishedPulling="2025-10-03 08:56:53.17681015 +0000 UTC m=+1057.478304480" observedRunningTime="2025-10-03 08:57:28.535216034 +0000 UTC m=+1092.836710364" watchObservedRunningTime="2025-10-03 08:57:28.538027125 +0000 UTC m=+1092.839521455" Oct 03 08:57:28 crc kubenswrapper[4765]: I1003 08:57:28.577244 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/rabbitmq-server-0" 
podStartSLOduration=38.193208137 podStartE2EDuration="52.577223922s" podCreationTimestamp="2025-10-03 08:56:36 +0000 UTC" firstStartedPulling="2025-10-03 08:56:38.743972565 +0000 UTC m=+1043.045466885" lastFinishedPulling="2025-10-03 08:56:53.12798834 +0000 UTC m=+1057.429482670" observedRunningTime="2025-10-03 08:57:28.571625181 +0000 UTC m=+1092.873119511" watchObservedRunningTime="2025-10-03 08:57:28.577223922 +0000 UTC m=+1092.878718252" Oct 03 08:57:29 crc kubenswrapper[4765]: I1003 08:57:29.765862 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-7a7f-account-create-t89nv"] Oct 03 08:57:29 crc kubenswrapper[4765]: E1003 08:57:29.766686 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96645098-7ec7-4672-8f05-bc20105308e3" containerName="mariadb-database-create" Oct 03 08:57:29 crc kubenswrapper[4765]: I1003 08:57:29.766708 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="96645098-7ec7-4672-8f05-bc20105308e3" containerName="mariadb-database-create" Oct 03 08:57:29 crc kubenswrapper[4765]: E1003 08:57:29.766724 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64809084-8cca-4e95-ace6-5ecfcf98b208" containerName="console" Oct 03 08:57:29 crc kubenswrapper[4765]: I1003 08:57:29.766731 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="64809084-8cca-4e95-ace6-5ecfcf98b208" containerName="console" Oct 03 08:57:29 crc kubenswrapper[4765]: I1003 08:57:29.766968 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="64809084-8cca-4e95-ace6-5ecfcf98b208" containerName="console" Oct 03 08:57:29 crc kubenswrapper[4765]: I1003 08:57:29.766989 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="96645098-7ec7-4672-8f05-bc20105308e3" containerName="mariadb-database-create" Oct 03 08:57:29 crc kubenswrapper[4765]: I1003 08:57:29.767809 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-7a7f-account-create-t89nv" Oct 03 08:57:29 crc kubenswrapper[4765]: I1003 08:57:29.772011 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-db-secret" Oct 03 08:57:29 crc kubenswrapper[4765]: I1003 08:57:29.797555 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-7a7f-account-create-t89nv"] Oct 03 08:57:29 crc kubenswrapper[4765]: I1003 08:57:29.814108 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5g42\" (UniqueName: \"kubernetes.io/projected/ce7aef1c-e74f-473a-9a1c-751e2c3185e2-kube-api-access-z5g42\") pod \"keystone-7a7f-account-create-t89nv\" (UID: \"ce7aef1c-e74f-473a-9a1c-751e2c3185e2\") " pod="watcher-kuttl-default/keystone-7a7f-account-create-t89nv" Oct 03 08:57:29 crc kubenswrapper[4765]: I1003 08:57:29.916021 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5g42\" (UniqueName: \"kubernetes.io/projected/ce7aef1c-e74f-473a-9a1c-751e2c3185e2-kube-api-access-z5g42\") pod \"keystone-7a7f-account-create-t89nv\" (UID: \"ce7aef1c-e74f-473a-9a1c-751e2c3185e2\") " pod="watcher-kuttl-default/keystone-7a7f-account-create-t89nv" Oct 03 08:57:29 crc kubenswrapper[4765]: I1003 08:57:29.971297 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5g42\" (UniqueName: \"kubernetes.io/projected/ce7aef1c-e74f-473a-9a1c-751e2c3185e2-kube-api-access-z5g42\") pod \"keystone-7a7f-account-create-t89nv\" (UID: \"ce7aef1c-e74f-473a-9a1c-751e2c3185e2\") " pod="watcher-kuttl-default/keystone-7a7f-account-create-t89nv" Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.049586 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.049918 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="prometheus" containerID="cri-o://0c229e566e190e473e9e2c9487ed215a62248b0201bc79de8834c43312fac2f3" gracePeriod=600 Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.049986 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="thanos-sidecar" containerID="cri-o://70438e2c809a5df05c1276583488a4fafe8953d53a962c8c6ac88dc81270bd9d" gracePeriod=600 Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.050037 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/prometheus-metric-storage-0" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="config-reloader" containerID="cri-o://688227fda8445cdeae376833574da4045f8c177b73d7aa368b47852a6b6c9443" gracePeriod=600 Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.103471 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-7a7f-account-create-t89nv" Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.536898 4765 generic.go:334] "Generic (PLEG): container finished" podID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerID="70438e2c809a5df05c1276583488a4fafe8953d53a962c8c6ac88dc81270bd9d" exitCode=0 Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.537272 4765 generic.go:334] "Generic (PLEG): container finished" podID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerID="688227fda8445cdeae376833574da4045f8c177b73d7aa368b47852a6b6c9443" exitCode=0 Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.537284 4765 generic.go:334] "Generic (PLEG): container finished" podID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerID="0c229e566e190e473e9e2c9487ed215a62248b0201bc79de8834c43312fac2f3" exitCode=0 Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.537311 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c29c72f9-5956-4af4-8936-e14f5d0ea18a","Type":"ContainerDied","Data":"70438e2c809a5df05c1276583488a4fafe8953d53a962c8c6ac88dc81270bd9d"} Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.537344 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c29c72f9-5956-4af4-8936-e14f5d0ea18a","Type":"ContainerDied","Data":"688227fda8445cdeae376833574da4045f8c177b73d7aa368b47852a6b6c9443"} Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.537357 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c29c72f9-5956-4af4-8936-e14f5d0ea18a","Type":"ContainerDied","Data":"0c229e566e190e473e9e2c9487ed215a62248b0201bc79de8834c43312fac2f3"} Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.680509 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.680826 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.719146 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-7a7f-account-create-t89nv"] Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.862883 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.943984 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-thanos-prometheus-http-client-file\") pod \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.944051 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-web-config\") pod \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.944090 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c29c72f9-5956-4af4-8936-e14f5d0ea18a-prometheus-metric-storage-rulefiles-0\") pod \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.944132 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wshq2\" (UniqueName: \"kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-kube-api-access-wshq2\") pod \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.944158 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-tls-assets\") pod \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.944295 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\") pod \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.944372 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config\") pod \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.944408 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config-out\") pod \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\" (UID: \"c29c72f9-5956-4af4-8936-e14f5d0ea18a\") " Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.945488 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c29c72f9-5956-4af4-8936-e14f5d0ea18a-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "c29c72f9-5956-4af4-8936-e14f5d0ea18a" (UID: "c29c72f9-5956-4af4-8936-e14f5d0ea18a"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.951254 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c29c72f9-5956-4af4-8936-e14f5d0ea18a" (UID: "c29c72f9-5956-4af4-8936-e14f5d0ea18a"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.951355 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config-out" (OuterVolumeSpecName: "config-out") pod "c29c72f9-5956-4af4-8936-e14f5d0ea18a" (UID: "c29c72f9-5956-4af4-8936-e14f5d0ea18a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.953656 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config" (OuterVolumeSpecName: "config") pod "c29c72f9-5956-4af4-8936-e14f5d0ea18a" (UID: "c29c72f9-5956-4af4-8936-e14f5d0ea18a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.953713 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c29c72f9-5956-4af4-8936-e14f5d0ea18a" (UID: "c29c72f9-5956-4af4-8936-e14f5d0ea18a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.953834 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-kube-api-access-wshq2" (OuterVolumeSpecName: "kube-api-access-wshq2") pod "c29c72f9-5956-4af4-8936-e14f5d0ea18a" (UID: "c29c72f9-5956-4af4-8936-e14f5d0ea18a"). InnerVolumeSpecName "kube-api-access-wshq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.967978 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "c29c72f9-5956-4af4-8936-e14f5d0ea18a" (UID: "c29c72f9-5956-4af4-8936-e14f5d0ea18a"). InnerVolumeSpecName "pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 08:57:30 crc kubenswrapper[4765]: I1003 08:57:30.993235 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-web-config" (OuterVolumeSpecName: "web-config") pod "c29c72f9-5956-4af4-8936-e14f5d0ea18a" (UID: "c29c72f9-5956-4af4-8936-e14f5d0ea18a"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.046420 4765 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.046473 4765 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-web-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.046490 4765 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c29c72f9-5956-4af4-8936-e14f5d0ea18a-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.046508 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wshq2\" (UniqueName: \"kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-kube-api-access-wshq2\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.046523 4765 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c29c72f9-5956-4af4-8936-e14f5d0ea18a-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.046578 4765 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\") on node \"crc\" " Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.046595 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.046611 4765 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c29c72f9-5956-4af4-8936-e14f5d0ea18a-config-out\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.073123 4765 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.073289 4765 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c") on node "crc" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.147761 4765 reconciler_common.go:293] "Volume detached for volume \"pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.548000 4765 generic.go:334] "Generic (PLEG): container finished" podID="ce7aef1c-e74f-473a-9a1c-751e2c3185e2" containerID="323551f52f86181592b93eb14fff39ac59c9ed25a4731bbfba3cc8d0f7b904fb" exitCode=0 Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.548096 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7a7f-account-create-t89nv" event={"ID":"ce7aef1c-e74f-473a-9a1c-751e2c3185e2","Type":"ContainerDied","Data":"323551f52f86181592b93eb14fff39ac59c9ed25a4731bbfba3cc8d0f7b904fb"} Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.548123 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7a7f-account-create-t89nv" event={"ID":"ce7aef1c-e74f-473a-9a1c-751e2c3185e2","Type":"ContainerStarted","Data":"da4710b7a5a9634afbe8487d6d9de9e1e4ed66f9ced1e0759353499320c3772a"} Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.551982 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"c29c72f9-5956-4af4-8936-e14f5d0ea18a","Type":"ContainerDied","Data":"481790c0b054b5d75a13abd20c15c82d035ba0b282d8161fabde53f5716bd714"} Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.552031 4765 scope.go:117] "RemoveContainer" containerID="70438e2c809a5df05c1276583488a4fafe8953d53a962c8c6ac88dc81270bd9d" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.552079 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.577844 4765 scope.go:117] "RemoveContainer" containerID="688227fda8445cdeae376833574da4045f8c177b73d7aa368b47852a6b6c9443" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.600921 4765 scope.go:117] "RemoveContainer" containerID="0c229e566e190e473e9e2c9487ed215a62248b0201bc79de8834c43312fac2f3" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.610127 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.626345 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.635377 4765 scope.go:117] "RemoveContainer" containerID="e8513b286421b80f51fa44472253d5d8576a7e0684141b66f00d41203128d445" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.637336 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Oct 03 08:57:31 crc kubenswrapper[4765]: E1003 08:57:31.637868 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="thanos-sidecar" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.637896 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="thanos-sidecar" Oct 03 08:57:31 crc kubenswrapper[4765]: E1003 08:57:31.637916 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="config-reloader" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.637925 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="config-reloader" Oct 03 08:57:31 crc kubenswrapper[4765]: E1003 08:57:31.637956 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="init-config-reloader" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.637966 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="init-config-reloader" Oct 03 08:57:31 crc kubenswrapper[4765]: E1003 08:57:31.637977 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="prometheus" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.637984 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="prometheus" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.638181 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="config-reloader" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.638203 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="prometheus" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.638217 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" containerName="thanos-sidecar" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.640219 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.675531 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.676538 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.676754 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-web-config" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.676867 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.677107 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"metric-storage-prometheus-dockercfg-qmkx5" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.678909 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-metric-storage-prometheus-svc" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.679491 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"prometheus-metric-storage-rulefiles-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.687226 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"prometheus-metric-storage-tls-assets-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.757383 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81065dff-f372-4966-8f6b-751090a1f5b6-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.757483 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcjtf\" (UniqueName: \"kubernetes.io/projected/81065dff-f372-4966-8f6b-751090a1f5b6-kube-api-access-jcjtf\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.757536 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.757564 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.757612 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-config\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.757661 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.757690 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81065dff-f372-4966-8f6b-751090a1f5b6-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.757732 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81065dff-f372-4966-8f6b-751090a1f5b6-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.757765 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.757805 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.757841 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.859706 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.859988 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-config\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.860153 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.860289 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81065dff-f372-4966-8f6b-751090a1f5b6-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.860457 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81065dff-f372-4966-8f6b-751090a1f5b6-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.860578 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.860607 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.860655 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.860716 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81065dff-f372-4966-8f6b-751090a1f5b6-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.861234 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcjtf\" (UniqueName: \"kubernetes.io/projected/81065dff-f372-4966-8f6b-751090a1f5b6-kube-api-access-jcjtf\") pod \"prometheus-metric-storage-0\" (UID: 
\"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.861292 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.862772 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/81065dff-f372-4966-8f6b-751090a1f5b6-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.867254 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.868697 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.868904 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/81065dff-f372-4966-8f6b-751090a1f5b6-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.868977 4765 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.869026 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bd2eec3abdd28a1cd7561a06af9eac99c6b9120801b74599f1637a7e0294eead/globalmount\"" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.869510 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.869706 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-config\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.870128 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/81065dff-f372-4966-8f6b-751090a1f5b6-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.870412 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.877312 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/81065dff-f372-4966-8f6b-751090a1f5b6-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.884075 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcjtf\" (UniqueName: \"kubernetes.io/projected/81065dff-f372-4966-8f6b-751090a1f5b6-kube-api-access-jcjtf\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 08:57:31.924154 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f5f0f0f-56c1-41b9-8467-2b1aaa11a12c\") pod \"prometheus-metric-storage-0\" (UID: \"81065dff-f372-4966-8f6b-751090a1f5b6\") " pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:31 crc kubenswrapper[4765]: I1003 
08:57:31.993342 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:32 crc kubenswrapper[4765]: I1003 08:57:32.318228 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c29c72f9-5956-4af4-8936-e14f5d0ea18a" path="/var/lib/kubelet/pods/c29c72f9-5956-4af4-8936-e14f5d0ea18a/volumes" Oct 03 08:57:32 crc kubenswrapper[4765]: I1003 08:57:32.526738 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/prometheus-metric-storage-0"] Oct 03 08:57:32 crc kubenswrapper[4765]: W1003 08:57:32.528788 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81065dff_f372_4966_8f6b_751090a1f5b6.slice/crio-32f92389edc69e32a8ef9325a1fb23b01861e53a31d1f279a251b9df9829cc2d WatchSource:0}: Error finding container 32f92389edc69e32a8ef9325a1fb23b01861e53a31d1f279a251b9df9829cc2d: Status 404 returned error can't find the container with id 32f92389edc69e32a8ef9325a1fb23b01861e53a31d1f279a251b9df9829cc2d Oct 03 08:57:32 crc kubenswrapper[4765]: I1003 08:57:32.575857 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"81065dff-f372-4966-8f6b-751090a1f5b6","Type":"ContainerStarted","Data":"32f92389edc69e32a8ef9325a1fb23b01861e53a31d1f279a251b9df9829cc2d"} Oct 03 08:57:32 crc kubenswrapper[4765]: I1003 08:57:32.848519 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-7a7f-account-create-t89nv" Oct 03 08:57:32 crc kubenswrapper[4765]: I1003 08:57:32.988291 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5g42\" (UniqueName: \"kubernetes.io/projected/ce7aef1c-e74f-473a-9a1c-751e2c3185e2-kube-api-access-z5g42\") pod \"ce7aef1c-e74f-473a-9a1c-751e2c3185e2\" (UID: \"ce7aef1c-e74f-473a-9a1c-751e2c3185e2\") " Oct 03 08:57:32 crc kubenswrapper[4765]: I1003 08:57:32.992995 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce7aef1c-e74f-473a-9a1c-751e2c3185e2-kube-api-access-z5g42" (OuterVolumeSpecName: "kube-api-access-z5g42") pod "ce7aef1c-e74f-473a-9a1c-751e2c3185e2" (UID: "ce7aef1c-e74f-473a-9a1c-751e2c3185e2"). InnerVolumeSpecName "kube-api-access-z5g42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:33 crc kubenswrapper[4765]: I1003 08:57:33.090620 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5g42\" (UniqueName: \"kubernetes.io/projected/ce7aef1c-e74f-473a-9a1c-751e2c3185e2-kube-api-access-z5g42\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:33 crc kubenswrapper[4765]: I1003 08:57:33.586694 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7a7f-account-create-t89nv" event={"ID":"ce7aef1c-e74f-473a-9a1c-751e2c3185e2","Type":"ContainerDied","Data":"da4710b7a5a9634afbe8487d6d9de9e1e4ed66f9ced1e0759353499320c3772a"} Oct 03 08:57:33 crc kubenswrapper[4765]: I1003 08:57:33.586744 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da4710b7a5a9634afbe8487d6d9de9e1e4ed66f9ced1e0759353499320c3772a" Oct 03 08:57:33 crc kubenswrapper[4765]: I1003 08:57:33.587020 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-7a7f-account-create-t89nv" Oct 03 08:57:35 crc kubenswrapper[4765]: I1003 08:57:35.602585 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"81065dff-f372-4966-8f6b-751090a1f5b6","Type":"ContainerStarted","Data":"e0583f74df577d6ef4a3f3848ec9b065dfa3dfce0684a6462f184fa455d0ac58"} Oct 03 08:57:37 crc kubenswrapper[4765]: I1003 08:57:37.939828 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-notifications-server-0" Oct 03 08:57:38 crc kubenswrapper[4765]: I1003 08:57:38.245788 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/rabbitmq-server-0" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.088132 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-db-sync-fq7mw"] Oct 03 08:57:40 crc kubenswrapper[4765]: E1003 08:57:40.088753 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7aef1c-e74f-473a-9a1c-751e2c3185e2" containerName="mariadb-account-create" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.088766 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7aef1c-e74f-473a-9a1c-751e2c3185e2" containerName="mariadb-account-create" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.088999 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce7aef1c-e74f-473a-9a1c-751e2c3185e2" containerName="mariadb-account-create" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.089560 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.091494 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.091531 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-rnw2m" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.092077 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.092109 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.096460 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-fq7mw"] Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.201883 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-combined-ca-bundle\") pod \"keystone-db-sync-fq7mw\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.202234 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-config-data\") pod \"keystone-db-sync-fq7mw\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.202326 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv2pm\" (UniqueName: \"kubernetes.io/projected/0271dd37-48aa-49af-bc6e-41b56b8ed75f-kube-api-access-nv2pm\") pod \"keystone-db-sync-fq7mw\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.303929 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-config-data\") pod \"keystone-db-sync-fq7mw\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.303983 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv2pm\" (UniqueName: \"kubernetes.io/projected/0271dd37-48aa-49af-bc6e-41b56b8ed75f-kube-api-access-nv2pm\") pod \"keystone-db-sync-fq7mw\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.304087 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-combined-ca-bundle\") pod \"keystone-db-sync-fq7mw\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.313752 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-config-data\") pod \"keystone-db-sync-fq7mw\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.321970 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-combined-ca-bundle\") pod \"keystone-db-sync-fq7mw\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.322201 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv2pm\" (UniqueName: \"kubernetes.io/projected/0271dd37-48aa-49af-bc6e-41b56b8ed75f-kube-api-access-nv2pm\") pod \"keystone-db-sync-fq7mw\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.418114 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:40 crc kubenswrapper[4765]: I1003 08:57:40.989839 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-fq7mw"] Oct 03 08:57:41 crc kubenswrapper[4765]: I1003 08:57:41.651978 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"81065dff-f372-4966-8f6b-751090a1f5b6","Type":"ContainerDied","Data":"e0583f74df577d6ef4a3f3848ec9b065dfa3dfce0684a6462f184fa455d0ac58"} Oct 03 08:57:41 crc kubenswrapper[4765]: I1003 08:57:41.652101 4765 generic.go:334] "Generic (PLEG): container finished" podID="81065dff-f372-4966-8f6b-751090a1f5b6" containerID="e0583f74df577d6ef4a3f3848ec9b065dfa3dfce0684a6462f184fa455d0ac58" exitCode=0 Oct 03 08:57:41 crc kubenswrapper[4765]: I1003 08:57:41.655302 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-fq7mw" event={"ID":"0271dd37-48aa-49af-bc6e-41b56b8ed75f","Type":"ContainerStarted","Data":"57f7021dc3a2fac8feb01a508f88cbbae0f9906e3579b841adc0c3910330ef62"} Oct 03 08:57:42 crc kubenswrapper[4765]: I1003 08:57:42.666857 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"81065dff-f372-4966-8f6b-751090a1f5b6","Type":"ContainerStarted","Data":"7965921e697fda6578bb954a346152f7a8519ba218adeff40a55db5e1ddd2465"} Oct 03 08:57:44 crc kubenswrapper[4765]: I1003 08:57:44.685794 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"81065dff-f372-4966-8f6b-751090a1f5b6","Type":"ContainerStarted","Data":"8abc4f65660b92b7afcce859349d20727e471bd2d51469e5b94495113799075c"} Oct 03 08:57:49 crc kubenswrapper[4765]: I1003 08:57:49.739309 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-fq7mw" event={"ID":"0271dd37-48aa-49af-bc6e-41b56b8ed75f","Type":"ContainerStarted","Data":"caa0c7221bdba9e370f980c5b399ceba968e1831eae98d7ae392be92df6494e5"} Oct 03 08:57:49 crc kubenswrapper[4765]: I1003 08:57:49.742081 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/prometheus-metric-storage-0" event={"ID":"81065dff-f372-4966-8f6b-751090a1f5b6","Type":"ContainerStarted","Data":"df36e325d1839d63385aa7896f564e0e3f72471512820a32e19934f18c6eeaa6"} Oct 03 08:57:49 crc kubenswrapper[4765]: I1003 08:57:49.754793 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-db-sync-fq7mw" podStartSLOduration=1.561230897 podStartE2EDuration="9.754776075s" podCreationTimestamp="2025-10-03 08:57:40 +0000 UTC" firstStartedPulling="2025-10-03 08:57:40.997313414 +0000 UTC m=+1105.298807744" lastFinishedPulling="2025-10-03 08:57:49.190858592 +0000 UTC m=+1113.492352922" observedRunningTime="2025-10-03 08:57:49.751816581 +0000 UTC m=+1114.053310911" watchObservedRunningTime="2025-10-03 08:57:49.754776075 +0000 UTC m=+1114.056270405" Oct 03 08:57:51 crc kubenswrapper[4765]: I1003 08:57:51.994119 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:57:52 crc kubenswrapper[4765]: I1003 08:57:52.767211 4765 generic.go:334] "Generic (PLEG): container finished" podID="0271dd37-48aa-49af-bc6e-41b56b8ed75f" containerID="caa0c7221bdba9e370f980c5b399ceba968e1831eae98d7ae392be92df6494e5" exitCode=0 Oct 03 08:57:52 crc 
kubenswrapper[4765]: I1003 08:57:52.767305 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-fq7mw" event={"ID":"0271dd37-48aa-49af-bc6e-41b56b8ed75f","Type":"ContainerDied","Data":"caa0c7221bdba9e370f980c5b399ceba968e1831eae98d7ae392be92df6494e5"} Oct 03 08:57:52 crc kubenswrapper[4765]: I1003 08:57:52.787975 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/prometheus-metric-storage-0" podStartSLOduration=21.787954499 podStartE2EDuration="21.787954499s" podCreationTimestamp="2025-10-03 08:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:57:49.786449942 +0000 UTC m=+1114.087944272" watchObservedRunningTime="2025-10-03 08:57:52.787954499 +0000 UTC m=+1117.089448829" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.117479 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.232486 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-combined-ca-bundle\") pod \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.232550 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv2pm\" (UniqueName: \"kubernetes.io/projected/0271dd37-48aa-49af-bc6e-41b56b8ed75f-kube-api-access-nv2pm\") pod \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.232667 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-config-data\") pod \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\" (UID: \"0271dd37-48aa-49af-bc6e-41b56b8ed75f\") " Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.238007 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0271dd37-48aa-49af-bc6e-41b56b8ed75f-kube-api-access-nv2pm" (OuterVolumeSpecName: "kube-api-access-nv2pm") pod "0271dd37-48aa-49af-bc6e-41b56b8ed75f" (UID: "0271dd37-48aa-49af-bc6e-41b56b8ed75f"). InnerVolumeSpecName "kube-api-access-nv2pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.256451 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0271dd37-48aa-49af-bc6e-41b56b8ed75f" (UID: "0271dd37-48aa-49af-bc6e-41b56b8ed75f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.274361 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-config-data" (OuterVolumeSpecName: "config-data") pod "0271dd37-48aa-49af-bc6e-41b56b8ed75f" (UID: "0271dd37-48aa-49af-bc6e-41b56b8ed75f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.334767 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.334809 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv2pm\" (UniqueName: \"kubernetes.io/projected/0271dd37-48aa-49af-bc6e-41b56b8ed75f-kube-api-access-nv2pm\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.334822 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0271dd37-48aa-49af-bc6e-41b56b8ed75f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.789791 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-db-sync-fq7mw" event={"ID":"0271dd37-48aa-49af-bc6e-41b56b8ed75f","Type":"ContainerDied","Data":"57f7021dc3a2fac8feb01a508f88cbbae0f9906e3579b841adc0c3910330ef62"} Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.789854 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57f7021dc3a2fac8feb01a508f88cbbae0f9906e3579b841adc0c3910330ef62" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.789857 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-db-sync-fq7mw" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.993526 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-g577w"] Oct 03 08:57:54 crc kubenswrapper[4765]: E1003 08:57:54.993873 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0271dd37-48aa-49af-bc6e-41b56b8ed75f" containerName="keystone-db-sync" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.993887 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0271dd37-48aa-49af-bc6e-41b56b8ed75f" containerName="keystone-db-sync" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.994034 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0271dd37-48aa-49af-bc6e-41b56b8ed75f" containerName="keystone-db-sync" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.994736 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.997006 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-rnw2m" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.997105 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Oct 03 08:57:54 crc kubenswrapper[4765]: I1003 08:57:54.998454 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.002144 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.007129 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-g577w"] Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.046380 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-scripts\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.046438 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-fernet-keys\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.046463 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-config-data\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.046481 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-credential-keys\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.046513 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-combined-ca-bundle\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.046535 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h28cm\" (UniqueName: \"kubernetes.io/projected/486a212f-4528-4a5c-8a96-35668025e8a1-kube-api-access-h28cm\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.135071 4765 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.136963 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.139360 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.140102 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.147476 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-scripts\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.147549 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-fernet-keys\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.147579 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-config-data\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.147600 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-credential-keys\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.147671 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-combined-ca-bundle\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.147699 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h28cm\" (UniqueName: \"kubernetes.io/projected/486a212f-4528-4a5c-8a96-35668025e8a1-kube-api-access-h28cm\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.155340 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-scripts\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.155342 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-credential-keys\") pod 
\"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.156711 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-config-data\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.163035 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.169604 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-fernet-keys\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.170147 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-combined-ca-bundle\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.200223 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h28cm\" (UniqueName: \"kubernetes.io/projected/486a212f-4528-4a5c-8a96-35668025e8a1-kube-api-access-h28cm\") pod \"keystone-bootstrap-g577w\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.249870 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-scripts\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.249976 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-config-data\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.250006 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-log-httpd\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.250028 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.250114 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-run-httpd\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.250135 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.250417 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhrcm\" (UniqueName: \"kubernetes.io/projected/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-kube-api-access-qhrcm\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.326244 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.352700 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.352766 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-run-httpd\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.352790 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.352859 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhrcm\" (UniqueName: \"kubernetes.io/projected/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-kube-api-access-qhrcm\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.352908 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-scripts\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.352961 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-config-data\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.353013 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-log-httpd\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.353332 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-run-httpd\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.354981 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-log-httpd\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.368533 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.368635 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-scripts\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.368747 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-config-data\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.369095 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.374904 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhrcm\" (UniqueName: \"kubernetes.io/projected/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-kube-api-access-qhrcm\") pod \"ceilometer-0\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.555613 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:57:55 crc kubenswrapper[4765]: I1003 08:57:55.891073 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-g577w"] Oct 03 08:57:56 crc kubenswrapper[4765]: I1003 08:57:56.093178 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:57:56 crc kubenswrapper[4765]: W1003 08:57:56.095118 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbeef12c0_2b55_4d2b_9a9f_a7d1423ace91.slice/crio-e6b156adb11fd7ebab08a18e3d0423e27b65e09242f9f0133a86de6b0e1e565f WatchSource:0}: Error finding container e6b156adb11fd7ebab08a18e3d0423e27b65e09242f9f0133a86de6b0e1e565f: Status 404 returned error can't find the container with id e6b156adb11fd7ebab08a18e3d0423e27b65e09242f9f0133a86de6b0e1e565f Oct 03 08:57:56 crc kubenswrapper[4765]: I1003 08:57:56.881211 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"beef12c0-2b55-4d2b-9a9f-a7d1423ace91","Type":"ContainerStarted","Data":"e6b156adb11fd7ebab08a18e3d0423e27b65e09242f9f0133a86de6b0e1e565f"} Oct 03 08:57:56 crc kubenswrapper[4765]: I1003 08:57:56.888272 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-g577w" event={"ID":"486a212f-4528-4a5c-8a96-35668025e8a1","Type":"ContainerStarted","Data":"371554d6a23867e20f20b0b92c3bae02aa1730cacbdf0e5968dd0522a11bf163"} Oct 03 08:57:56 crc kubenswrapper[4765]: I1003 08:57:56.888324 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-g577w" event={"ID":"486a212f-4528-4a5c-8a96-35668025e8a1","Type":"ContainerStarted","Data":"642067d634481e5eee6bd288fa633667db14f6f940d4d9e28befed394dd8159a"} Oct 03 08:57:56 crc kubenswrapper[4765]: I1003 08:57:56.938001 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:57:56 crc kubenswrapper[4765]: I1003 08:57:56.956708 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-g577w" podStartSLOduration=2.9566887729999998 podStartE2EDuration="2.956688773s" podCreationTimestamp="2025-10-03 08:57:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:57:56.923315344 +0000 UTC m=+1121.224809674" watchObservedRunningTime="2025-10-03 08:57:56.956688773 +0000 UTC m=+1121.258183103" Oct 03 08:58:00 crc kubenswrapper[4765]: I1003 08:58:00.679911 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:58:00 crc kubenswrapper[4765]: I1003 08:58:00.680674 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:58:00 crc kubenswrapper[4765]: I1003 08:58:00.927144 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"beef12c0-2b55-4d2b-9a9f-a7d1423ace91","Type":"ContainerStarted","Data":"f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef"} Oct 03 08:58:00 crc kubenswrapper[4765]: I1003 08:58:00.928514 4765 generic.go:334] "Generic (PLEG): container finished" podID="486a212f-4528-4a5c-8a96-35668025e8a1" containerID="371554d6a23867e20f20b0b92c3bae02aa1730cacbdf0e5968dd0522a11bf163" exitCode=0 Oct 03 08:58:00 crc kubenswrapper[4765]: I1003 08:58:00.928546 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-g577w" event={"ID":"486a212f-4528-4a5c-8a96-35668025e8a1","Type":"ContainerDied","Data":"371554d6a23867e20f20b0b92c3bae02aa1730cacbdf0e5968dd0522a11bf163"} Oct 03 08:58:01 crc kubenswrapper[4765]: I1003 08:58:01.993928 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.000099 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.423189 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.597709 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-credential-keys\") pod \"486a212f-4528-4a5c-8a96-35668025e8a1\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.597796 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-config-data\") pod \"486a212f-4528-4a5c-8a96-35668025e8a1\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.597871 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-combined-ca-bundle\") pod \"486a212f-4528-4a5c-8a96-35668025e8a1\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.597922 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h28cm\" (UniqueName: \"kubernetes.io/projected/486a212f-4528-4a5c-8a96-35668025e8a1-kube-api-access-h28cm\") pod \"486a212f-4528-4a5c-8a96-35668025e8a1\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.597976 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-fernet-keys\") pod \"486a212f-4528-4a5c-8a96-35668025e8a1\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.598050 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-scripts\") pod \"486a212f-4528-4a5c-8a96-35668025e8a1\" (UID: \"486a212f-4528-4a5c-8a96-35668025e8a1\") " Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.601401 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-scripts" (OuterVolumeSpecName: "scripts") pod "486a212f-4528-4a5c-8a96-35668025e8a1" (UID: "486a212f-4528-4a5c-8a96-35668025e8a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.601605 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486a212f-4528-4a5c-8a96-35668025e8a1-kube-api-access-h28cm" (OuterVolumeSpecName: "kube-api-access-h28cm") pod "486a212f-4528-4a5c-8a96-35668025e8a1" (UID: "486a212f-4528-4a5c-8a96-35668025e8a1"). InnerVolumeSpecName "kube-api-access-h28cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.603784 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "486a212f-4528-4a5c-8a96-35668025e8a1" (UID: "486a212f-4528-4a5c-8a96-35668025e8a1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.605468 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "486a212f-4528-4a5c-8a96-35668025e8a1" (UID: "486a212f-4528-4a5c-8a96-35668025e8a1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.622953 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "486a212f-4528-4a5c-8a96-35668025e8a1" (UID: "486a212f-4528-4a5c-8a96-35668025e8a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.623242 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-config-data" (OuterVolumeSpecName: "config-data") pod "486a212f-4528-4a5c-8a96-35668025e8a1" (UID: "486a212f-4528-4a5c-8a96-35668025e8a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.700467 4765 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.700511 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.700525 4765 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.700538 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.700552 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486a212f-4528-4a5c-8a96-35668025e8a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.700565 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h28cm\" (UniqueName: \"kubernetes.io/projected/486a212f-4528-4a5c-8a96-35668025e8a1-kube-api-access-h28cm\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.943799 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-g577w" event={"ID":"486a212f-4528-4a5c-8a96-35668025e8a1","Type":"ContainerDied","Data":"642067d634481e5eee6bd288fa633667db14f6f940d4d9e28befed394dd8159a"} Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.943849 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="642067d634481e5eee6bd288fa633667db14f6f940d4d9e28befed394dd8159a" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.943817 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-g577w" Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.947577 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"beef12c0-2b55-4d2b-9a9f-a7d1423ace91","Type":"ContainerStarted","Data":"b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e"} Oct 03 08:58:02 crc kubenswrapper[4765]: I1003 08:58:02.950828 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/prometheus-metric-storage-0" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.041924 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-g577w"] Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.047792 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-g577w"] Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.118857 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-dgcbq"] Oct 03 08:58:03 crc kubenswrapper[4765]: E1003 08:58:03.119273 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486a212f-4528-4a5c-8a96-35668025e8a1" containerName="keystone-bootstrap" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.119298 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="486a212f-4528-4a5c-8a96-35668025e8a1" containerName="keystone-bootstrap" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.119485 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="486a212f-4528-4a5c-8a96-35668025e8a1" containerName="keystone-bootstrap" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.120257 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.124200 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.124211 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.124400 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-rnw2m" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.126305 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.136317 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-dgcbq"] Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.309134 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-combined-ca-bundle\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.309326 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-fernet-keys\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.309379 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-config-data\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.309423 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-scripts\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.309453 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dx88\" (UniqueName: \"kubernetes.io/projected/b3c84158-3adc-480a-8e89-c28795415db5-kube-api-access-7dx88\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.309515 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-credential-keys\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.411735 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-fernet-keys\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.411826 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-config-data\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.411862 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-scripts\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.411883 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dx88\" (UniqueName: \"kubernetes.io/projected/b3c84158-3adc-480a-8e89-c28795415db5-kube-api-access-7dx88\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.411925 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-credential-keys\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.411965 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-combined-ca-bundle\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.431513 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-combined-ca-bundle\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.431746 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-scripts\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.432036 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-credential-keys\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.439446 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-config-data\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.440029 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-fernet-keys\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.442168 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dx88\" (UniqueName: \"kubernetes.io/projected/b3c84158-3adc-480a-8e89-c28795415db5-kube-api-access-7dx88\") pod \"keystone-bootstrap-dgcbq\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.444171 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:03 crc kubenswrapper[4765]: I1003 08:58:03.953934 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-dgcbq"] Oct 03 08:58:04 crc kubenswrapper[4765]: W1003 08:58:04.013624 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3c84158_3adc_480a_8e89_c28795415db5.slice/crio-8e7e7340a98292e60a8a66a306646210c0eb43e51530ac4c14f67c753b8938be WatchSource:0}: Error finding container 8e7e7340a98292e60a8a66a306646210c0eb43e51530ac4c14f67c753b8938be: Status 404 returned error can't find the container with id 8e7e7340a98292e60a8a66a306646210c0eb43e51530ac4c14f67c753b8938be Oct 03 08:58:04 crc kubenswrapper[4765]: I1003 08:58:04.319196 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486a212f-4528-4a5c-8a96-35668025e8a1" path="/var/lib/kubelet/pods/486a212f-4528-4a5c-8a96-35668025e8a1/volumes" Oct 03 08:58:05 crc kubenswrapper[4765]: I1003 08:58:05.010749 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" event={"ID":"b3c84158-3adc-480a-8e89-c28795415db5","Type":"ContainerStarted","Data":"f9b1fa593b2043d2de9aa7a0b58ede46415e6225d4d4183e3963cdf0e49d9560"} Oct 03 08:58:05 crc kubenswrapper[4765]: I1003 08:58:05.010794 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" event={"ID":"b3c84158-3adc-480a-8e89-c28795415db5","Type":"ContainerStarted","Data":"8e7e7340a98292e60a8a66a306646210c0eb43e51530ac4c14f67c753b8938be"} Oct 03 08:58:05 crc kubenswrapper[4765]: I1003 08:58:05.030451 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" podStartSLOduration=2.030436958 podStartE2EDuration="2.030436958s" podCreationTimestamp="2025-10-03 08:58:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:05.028788916 +0000 UTC m=+1129.330283246" watchObservedRunningTime="2025-10-03 08:58:05.030436958 +0000 UTC m=+1129.331931288" Oct 03 08:58:08 crc kubenswrapper[4765]: I1003 08:58:08.037965 4765 generic.go:334] "Generic (PLEG): container finished" podID="b3c84158-3adc-480a-8e89-c28795415db5" 
containerID="f9b1fa593b2043d2de9aa7a0b58ede46415e6225d4d4183e3963cdf0e49d9560" exitCode=0 Oct 03 08:58:08 crc kubenswrapper[4765]: I1003 08:58:08.038053 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" event={"ID":"b3c84158-3adc-480a-8e89-c28795415db5","Type":"ContainerDied","Data":"f9b1fa593b2043d2de9aa7a0b58ede46415e6225d4d4183e3963cdf0e49d9560"} Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.109064 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"beef12c0-2b55-4d2b-9a9f-a7d1423ace91","Type":"ContainerStarted","Data":"506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700"} Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.568382 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.631457 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-fernet-keys\") pod \"b3c84158-3adc-480a-8e89-c28795415db5\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.631522 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-credential-keys\") pod \"b3c84158-3adc-480a-8e89-c28795415db5\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.631543 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-config-data\") pod \"b3c84158-3adc-480a-8e89-c28795415db5\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.631611 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-scripts\") pod \"b3c84158-3adc-480a-8e89-c28795415db5\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.631720 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-combined-ca-bundle\") pod \"b3c84158-3adc-480a-8e89-c28795415db5\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.631760 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dx88\" (UniqueName: \"kubernetes.io/projected/b3c84158-3adc-480a-8e89-c28795415db5-kube-api-access-7dx88\") pod \"b3c84158-3adc-480a-8e89-c28795415db5\" (UID: \"b3c84158-3adc-480a-8e89-c28795415db5\") " Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.637725 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-scripts" (OuterVolumeSpecName: "scripts") pod "b3c84158-3adc-480a-8e89-c28795415db5" (UID: "b3c84158-3adc-480a-8e89-c28795415db5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.637935 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b3c84158-3adc-480a-8e89-c28795415db5" (UID: "b3c84158-3adc-480a-8e89-c28795415db5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.638460 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3c84158-3adc-480a-8e89-c28795415db5-kube-api-access-7dx88" (OuterVolumeSpecName: "kube-api-access-7dx88") pod "b3c84158-3adc-480a-8e89-c28795415db5" (UID: "b3c84158-3adc-480a-8e89-c28795415db5"). InnerVolumeSpecName "kube-api-access-7dx88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.657037 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b3c84158-3adc-480a-8e89-c28795415db5" (UID: "b3c84158-3adc-480a-8e89-c28795415db5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.677740 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3c84158-3adc-480a-8e89-c28795415db5" (UID: "b3c84158-3adc-480a-8e89-c28795415db5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.694153 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-config-data" (OuterVolumeSpecName: "config-data") pod "b3c84158-3adc-480a-8e89-c28795415db5" (UID: "b3c84158-3adc-480a-8e89-c28795415db5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.736388 4765 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.736422 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.736431 4765 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.736442 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.736453 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c84158-3adc-480a-8e89-c28795415db5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:09 crc kubenswrapper[4765]: I1003 08:58:09.736465 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dx88\" (UniqueName: \"kubernetes.io/projected/b3c84158-3adc-480a-8e89-c28795415db5-kube-api-access-7dx88\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.119736 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" event={"ID":"b3c84158-3adc-480a-8e89-c28795415db5","Type":"ContainerDied","Data":"8e7e7340a98292e60a8a66a306646210c0eb43e51530ac4c14f67c753b8938be"} Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.119773 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e7e7340a98292e60a8a66a306646210c0eb43e51530ac4c14f67c753b8938be" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.119822 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-dgcbq" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.241809 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-596469b6bd-hwkh5"] Oct 03 08:58:10 crc kubenswrapper[4765]: E1003 08:58:10.242264 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3c84158-3adc-480a-8e89-c28795415db5" containerName="keystone-bootstrap" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.242285 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3c84158-3adc-480a-8e89-c28795415db5" containerName="keystone-bootstrap" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.242471 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3c84158-3adc-480a-8e89-c28795415db5" containerName="keystone-bootstrap" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.243260 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.251131 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-scripts" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.251135 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-keystone-dockercfg-rnw2m" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.251434 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone-config-data" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.251491 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-public-svc" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.255184 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-keystone-internal-svc" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.255228 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"keystone" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.260168 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-596469b6bd-hwkh5"] Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.350887 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-public-tls-certs\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.351250 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-combined-ca-bundle\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.351363 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-credential-keys\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.351480 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-fernet-keys\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.351611 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-config-data\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.351772 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-internal-tls-certs\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.351938 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-scripts\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.352614 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zz5t\" (UniqueName: \"kubernetes.io/projected/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-kube-api-access-6zz5t\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.459300 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zz5t\" (UniqueName: \"kubernetes.io/projected/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-kube-api-access-6zz5t\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.459359 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-public-tls-certs\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.459437 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-combined-ca-bundle\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.459468 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-credential-keys\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.459505 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-fernet-keys\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.459531 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-config-data\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.459635 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-internal-tls-certs\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.459674 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-scripts\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.463440 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-scripts\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.464389 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-combined-ca-bundle\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.468923 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-fernet-keys\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.469252 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-credential-keys\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.472788 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-config-data\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.473032 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-public-tls-certs\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.480679 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-internal-tls-certs\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.483054 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zz5t\" (UniqueName: 
\"kubernetes.io/projected/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-kube-api-access-6zz5t\") pod \"keystone-596469b6bd-hwkh5\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:10 crc kubenswrapper[4765]: I1003 08:58:10.565778 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:11 crc kubenswrapper[4765]: I1003 08:58:11.033045 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-596469b6bd-hwkh5"] Oct 03 08:58:11 crc kubenswrapper[4765]: W1003 08:58:11.045053 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod783d142e_5f7f_4ea1_bed2_6b55f7a35aec.slice/crio-07bf28ea03cc6da5dc699e7363213b3bb850dce5a6ade67de6680b27e8ebee9a WatchSource:0}: Error finding container 07bf28ea03cc6da5dc699e7363213b3bb850dce5a6ade67de6680b27e8ebee9a: Status 404 returned error can't find the container with id 07bf28ea03cc6da5dc699e7363213b3bb850dce5a6ade67de6680b27e8ebee9a Oct 03 08:58:11 crc kubenswrapper[4765]: I1003 08:58:11.136540 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" event={"ID":"783d142e-5f7f-4ea1-bed2-6b55f7a35aec","Type":"ContainerStarted","Data":"07bf28ea03cc6da5dc699e7363213b3bb850dce5a6ade67de6680b27e8ebee9a"} Oct 03 08:58:12 crc kubenswrapper[4765]: I1003 08:58:12.155411 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" event={"ID":"783d142e-5f7f-4ea1-bed2-6b55f7a35aec","Type":"ContainerStarted","Data":"f4b4d45ba6779950271e08ec96068d24d4e2040f61a118c926515522ef7fbe42"} Oct 03 08:58:12 crc kubenswrapper[4765]: I1003 08:58:12.155971 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:12 crc kubenswrapper[4765]: I1003 08:58:12.185470 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" podStartSLOduration=2.185450286 podStartE2EDuration="2.185450286s" podCreationTimestamp="2025-10-03 08:58:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:12.173942877 +0000 UTC m=+1136.475437207" watchObservedRunningTime="2025-10-03 08:58:12.185450286 +0000 UTC m=+1136.486944616" Oct 03 08:58:17 crc kubenswrapper[4765]: I1003 08:58:17.200410 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"beef12c0-2b55-4d2b-9a9f-a7d1423ace91","Type":"ContainerStarted","Data":"b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1"} Oct 03 08:58:17 crc kubenswrapper[4765]: I1003 08:58:17.201124 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="ceilometer-central-agent" containerID="cri-o://f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef" gracePeriod=30 Oct 03 08:58:17 crc kubenswrapper[4765]: I1003 08:58:17.201216 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:17 crc kubenswrapper[4765]: I1003 08:58:17.201525 4765 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="watcher-kuttl-default/ceilometer-0" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="proxy-httpd" containerID="cri-o://b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1" gracePeriod=30 Oct 03 08:58:17 crc kubenswrapper[4765]: I1003 08:58:17.201581 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="sg-core" containerID="cri-o://506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700" gracePeriod=30 Oct 03 08:58:17 crc kubenswrapper[4765]: I1003 08:58:17.201635 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="ceilometer-notification-agent" containerID="cri-o://b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e" gracePeriod=30 Oct 03 08:58:17 crc kubenswrapper[4765]: I1003 08:58:17.234082 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.157592215 podStartE2EDuration="22.234064218s" podCreationTimestamp="2025-10-03 08:57:55 +0000 UTC" firstStartedPulling="2025-10-03 08:57:56.09772106 +0000 UTC m=+1120.399215390" lastFinishedPulling="2025-10-03 08:58:16.174193073 +0000 UTC m=+1140.475687393" observedRunningTime="2025-10-03 08:58:17.228319854 +0000 UTC m=+1141.529814194" watchObservedRunningTime="2025-10-03 08:58:17.234064218 +0000 UTC m=+1141.535558548" Oct 03 08:58:18 crc kubenswrapper[4765]: I1003 08:58:18.209617 4765 generic.go:334] "Generic (PLEG): container finished" podID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerID="b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1" exitCode=0 Oct 03 08:58:18 crc kubenswrapper[4765]: I1003 08:58:18.209988 4765 generic.go:334] "Generic (PLEG): container finished" podID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerID="506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700" exitCode=2 Oct 03 08:58:18 crc kubenswrapper[4765]: I1003 08:58:18.210004 4765 generic.go:334] "Generic (PLEG): container finished" podID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerID="f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef" exitCode=0 Oct 03 08:58:18 crc kubenswrapper[4765]: I1003 08:58:18.209697 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"beef12c0-2b55-4d2b-9a9f-a7d1423ace91","Type":"ContainerDied","Data":"b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1"} Oct 03 08:58:18 crc kubenswrapper[4765]: I1003 08:58:18.210037 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"beef12c0-2b55-4d2b-9a9f-a7d1423ace91","Type":"ContainerDied","Data":"506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700"} Oct 03 08:58:18 crc kubenswrapper[4765]: I1003 08:58:18.210051 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"beef12c0-2b55-4d2b-9a9f-a7d1423ace91","Type":"ContainerDied","Data":"f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef"} Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:19.918604 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.021803 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-config-data\") pod \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.021887 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-sg-core-conf-yaml\") pod \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.021914 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-combined-ca-bundle\") pod \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.021956 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-run-httpd\") pod \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.021984 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhrcm\" (UniqueName: \"kubernetes.io/projected/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-kube-api-access-qhrcm\") pod \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.022037 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-scripts\") pod \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.022064 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-log-httpd\") pod \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\" (UID: \"beef12c0-2b55-4d2b-9a9f-a7d1423ace91\") " Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.022501 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "beef12c0-2b55-4d2b-9a9f-a7d1423ace91" (UID: "beef12c0-2b55-4d2b-9a9f-a7d1423ace91"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.022625 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "beef12c0-2b55-4d2b-9a9f-a7d1423ace91" (UID: "beef12c0-2b55-4d2b-9a9f-a7d1423ace91"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.022885 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.022897 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.027317 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-kube-api-access-qhrcm" (OuterVolumeSpecName: "kube-api-access-qhrcm") pod "beef12c0-2b55-4d2b-9a9f-a7d1423ace91" (UID: "beef12c0-2b55-4d2b-9a9f-a7d1423ace91"). InnerVolumeSpecName "kube-api-access-qhrcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.027467 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-scripts" (OuterVolumeSpecName: "scripts") pod "beef12c0-2b55-4d2b-9a9f-a7d1423ace91" (UID: "beef12c0-2b55-4d2b-9a9f-a7d1423ace91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.044576 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "beef12c0-2b55-4d2b-9a9f-a7d1423ace91" (UID: "beef12c0-2b55-4d2b-9a9f-a7d1423ace91"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.080002 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "beef12c0-2b55-4d2b-9a9f-a7d1423ace91" (UID: "beef12c0-2b55-4d2b-9a9f-a7d1423ace91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.097281 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-config-data" (OuterVolumeSpecName: "config-data") pod "beef12c0-2b55-4d2b-9a9f-a7d1423ace91" (UID: "beef12c0-2b55-4d2b-9a9f-a7d1423ace91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.124699 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.124730 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.124746 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.124758 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.124772 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhrcm\" (UniqueName: \"kubernetes.io/projected/beef12c0-2b55-4d2b-9a9f-a7d1423ace91-kube-api-access-qhrcm\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.231352 4765 generic.go:334] "Generic (PLEG): container finished" podID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerID="b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e" exitCode=0 Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.231388 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"beef12c0-2b55-4d2b-9a9f-a7d1423ace91","Type":"ContainerDied","Data":"b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e"} Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.231412 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"beef12c0-2b55-4d2b-9a9f-a7d1423ace91","Type":"ContainerDied","Data":"e6b156adb11fd7ebab08a18e3d0423e27b65e09242f9f0133a86de6b0e1e565f"} Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.231428 4765 scope.go:117] "RemoveContainer" containerID="b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.231425 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.248681 4765 scope.go:117] "RemoveContainer" containerID="506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.273829 4765 scope.go:117] "RemoveContainer" containerID="b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.273879 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.311239 4765 scope.go:117] "RemoveContainer" containerID="f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.329828 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.329875 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:58:20 crc kubenswrapper[4765]: E1003 08:58:20.330321 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="ceilometer-central-agent" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.330333 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="ceilometer-central-agent" Oct 03 08:58:20 crc kubenswrapper[4765]: E1003 08:58:20.330352 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="proxy-httpd" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.330358 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="proxy-httpd" Oct 03 08:58:20 crc kubenswrapper[4765]: E1003 08:58:20.330382 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="sg-core" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.330388 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="sg-core" Oct 03 08:58:20 crc kubenswrapper[4765]: E1003 08:58:20.330412 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="ceilometer-notification-agent" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.330417 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="ceilometer-notification-agent" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.330890 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="ceilometer-central-agent" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.330930 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="proxy-httpd" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.330968 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="ceilometer-notification-agent" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.330991 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" containerName="sg-core" Oct 03 08:58:20 crc kubenswrapper[4765]: 
I1003 08:58:20.336606 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.338820 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.340460 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.343690 4765 scope.go:117] "RemoveContainer" containerID="b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1" Oct 03 08:58:20 crc kubenswrapper[4765]: E1003 08:58:20.345934 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1\": container with ID starting with b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1 not found: ID does not exist" containerID="b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.345987 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1"} err="failed to get container status \"b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1\": rpc error: code = NotFound desc = could not find container \"b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1\": container with ID starting with b3a098a50f182f76c586973ae1b4d9ee5962a4c70758f529ec73c7dd0c220fd1 not found: ID does not exist" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.346013 4765 scope.go:117] "RemoveContainer" containerID="506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.353952 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:58:20 crc kubenswrapper[4765]: E1003 08:58:20.353977 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700\": container with ID starting with 506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700 not found: ID does not exist" containerID="506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.354052 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700"} err="failed to get container status \"506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700\": rpc error: code = NotFound desc = could not find container \"506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700\": container with ID starting with 506fb2d3d4c8bac9b32cd6a53790ed93e3d64eb0bafc490c228af1059c7b8700 not found: ID does not exist" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.354090 4765 scope.go:117] "RemoveContainer" containerID="b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e" Oct 03 08:58:20 crc kubenswrapper[4765]: E1003 08:58:20.357490 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e\": container with ID starting with b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e not found: ID does not exist" containerID="b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.357542 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e"} err="failed to get container status \"b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e\": rpc error: code = NotFound desc = could not find container \"b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e\": container with ID starting with b8b21548a206b9679dfc793b55fa6989578901d20d5258e8819ad2a40f37989e not found: ID does not exist" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.357599 4765 scope.go:117] "RemoveContainer" containerID="f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef" Oct 03 08:58:20 crc kubenswrapper[4765]: E1003 08:58:20.358162 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef\": container with ID starting with f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef not found: ID does not exist" containerID="f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.358191 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef"} err="failed to get container status \"f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef\": rpc error: code = NotFound desc = could not find container \"f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef\": container with ID starting with f953edf63c785375c1c38fc6575faab03f066c99bdcfc21d7d10d71ccf0700ef not found: ID does not exist" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.434940 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.435183 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.435303 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-run-httpd\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.435405 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-config-data\") pod 
\"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.435505 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-log-httpd\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.435687 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-scripts\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.435784 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjqj8\" (UniqueName: \"kubernetes.io/projected/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-kube-api-access-rjqj8\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.536999 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-config-data\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.537336 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-log-httpd\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.537375 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-scripts\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.537400 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjqj8\" (UniqueName: \"kubernetes.io/projected/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-kube-api-access-rjqj8\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.537435 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.537451 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.537477 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-run-httpd\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.537771 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-log-httpd\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.537855 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-run-httpd\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.542006 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-config-data\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.542832 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.543260 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.543442 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-scripts\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.552410 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjqj8\" (UniqueName: \"kubernetes.io/projected/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-kube-api-access-rjqj8\") pod \"ceilometer-0\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4765]: I1003 08:58:20.663498 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:21 crc kubenswrapper[4765]: I1003 08:58:21.105242 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:58:21 crc kubenswrapper[4765]: W1003 08:58:21.109554 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod143fccd6_3ba5_49ce_bf0b_e5a89f3ca38f.slice/crio-05f0f678fcd2d926c5ea3b92971788eee00516772d1ca6c635591cff46dcf269 WatchSource:0}: Error finding container 05f0f678fcd2d926c5ea3b92971788eee00516772d1ca6c635591cff46dcf269: Status 404 returned error can't find the container with id 05f0f678fcd2d926c5ea3b92971788eee00516772d1ca6c635591cff46dcf269 Oct 03 08:58:21 crc kubenswrapper[4765]: I1003 08:58:21.240604 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f","Type":"ContainerStarted","Data":"05f0f678fcd2d926c5ea3b92971788eee00516772d1ca6c635591cff46dcf269"} Oct 03 08:58:22 crc kubenswrapper[4765]: I1003 08:58:22.251110 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f","Type":"ContainerStarted","Data":"e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a"} Oct 03 08:58:22 crc kubenswrapper[4765]: I1003 08:58:22.317388 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beef12c0-2b55-4d2b-9a9f-a7d1423ace91" path="/var/lib/kubelet/pods/beef12c0-2b55-4d2b-9a9f-a7d1423ace91/volumes" Oct 03 08:58:23 crc kubenswrapper[4765]: I1003 08:58:23.261031 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f","Type":"ContainerStarted","Data":"08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20"} Oct 03 08:58:24 crc kubenswrapper[4765]: I1003 08:58:24.269821 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f","Type":"ContainerStarted","Data":"52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880"} Oct 03 08:58:25 crc kubenswrapper[4765]: I1003 08:58:25.281558 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f","Type":"ContainerStarted","Data":"bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315"} Oct 03 08:58:25 crc kubenswrapper[4765]: I1003 08:58:25.281945 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:25 crc kubenswrapper[4765]: I1003 08:58:25.306258 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.859142209 podStartE2EDuration="5.306233163s" podCreationTimestamp="2025-10-03 08:58:20 +0000 UTC" firstStartedPulling="2025-10-03 08:58:21.112906271 +0000 UTC m=+1145.414400601" lastFinishedPulling="2025-10-03 08:58:24.559997225 +0000 UTC m=+1148.861491555" observedRunningTime="2025-10-03 08:58:25.299419081 +0000 UTC m=+1149.600913421" watchObservedRunningTime="2025-10-03 08:58:25.306233163 +0000 UTC m=+1149.607727483" Oct 03 08:58:30 crc kubenswrapper[4765]: I1003 08:58:30.680134 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:58:30 crc kubenswrapper[4765]: I1003 08:58:30.681857 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:58:30 crc kubenswrapper[4765]: I1003 08:58:30.681967 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 08:58:30 crc kubenswrapper[4765]: I1003 08:58:30.682763 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ea26be161c74098dfac030bd0f30c6280c924f6e3ecfdb459bea4fd5d08ace1"} pod="openshift-machine-config-operator/machine-config-daemon-j8mss" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:58:30 crc kubenswrapper[4765]: I1003 08:58:30.682893 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" containerID="cri-o://3ea26be161c74098dfac030bd0f30c6280c924f6e3ecfdb459bea4fd5d08ace1" gracePeriod=600 Oct 03 08:58:31 crc kubenswrapper[4765]: I1003 08:58:31.330949 4765 generic.go:334] "Generic (PLEG): container finished" podID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerID="3ea26be161c74098dfac030bd0f30c6280c924f6e3ecfdb459bea4fd5d08ace1" exitCode=0 Oct 03 08:58:31 crc kubenswrapper[4765]: I1003 08:58:31.331021 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerDied","Data":"3ea26be161c74098dfac030bd0f30c6280c924f6e3ecfdb459bea4fd5d08ace1"} Oct 03 08:58:31 crc kubenswrapper[4765]: I1003 08:58:31.331188 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"6fb31d91d836934f6972647efe119143f3dae08e768c5506a0423da0d8bd74e8"} Oct 03 08:58:31 crc kubenswrapper[4765]: I1003 08:58:31.331209 4765 scope.go:117] "RemoveContainer" containerID="bfe51b01984985879807e07e7a2482b4ea6735b787f2d94829df4202c6f13dc1" Oct 03 08:58:42 crc kubenswrapper[4765]: I1003 08:58:42.198314 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.754937 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstackclient"] Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.756323 4765 util.go:30] "No sandbox for pod can be found. 
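The machine-config-daemon entries above show the kubelet's HTTP liveness probe against http://127.0.0.1:8798/health failing with "connection refused", after which the container is killed with its 600-second grace period and restarted (the old container ID is then handed to RemoveContainer). An httpGet probe is just a GET that counts any transport error or status outside 200-399 as a failure; a self-contained sketch of an equivalent check, where the URL comes from the log and the 3-second timeout is an assumption (the kubelet's configured probe timeout may differ):

```go
package main

import (
	"fmt"
	"net/http"
	"os"
	"time"
)

func main() {
	client := &http.Client{Timeout: 3 * time.Second}

	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		// The path seen above: "connect: connection refused" is a probe failure;
		// after enough consecutive failures the kubelet restarts the container.
		fmt.Fprintf(os.Stderr, "probe failed: %v\n", err)
		os.Exit(1)
	}
	defer resp.Body.Close()

	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		fmt.Fprintf(os.Stderr, "probe failed: status %d\n", resp.StatusCode)
		os.Exit(1)
	}
	fmt.Println("probe ok:", resp.Status)
}
```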
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.764994 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"openstack-config" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.765010 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstack-config-secret" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.765953 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.767031 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"openstackclient-openstackclient-dockercfg-6tblm" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.821539 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/openstackclient"] Oct 03 08:58:44 crc kubenswrapper[4765]: E1003 08:58:44.822198 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-pljnq openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[combined-ca-bundle kube-api-access-pljnq openstack-config openstack-config-secret]: context canceled" pod="watcher-kuttl-default/openstackclient" podUID="b02ef8ff-0096-42b2-be52-8c391f444d1c" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.831889 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/openstackclient"] Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.835837 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config-secret\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.835923 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.835979 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.836025 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pljnq\" (UniqueName: \"kubernetes.io/projected/b02ef8ff-0096-42b2-be52-8c391f444d1c-kube-api-access-pljnq\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.894574 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/openstackclient"] Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.895863 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.905609 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.936752 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pljnq\" (UniqueName: \"kubernetes.io/projected/b02ef8ff-0096-42b2-be52-8c391f444d1c-kube-api-access-pljnq\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.936806 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/73ef00a9-8d50-49fb-84ae-669fff822e30-openstack-config-secret\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.936865 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt6fs\" (UniqueName: \"kubernetes.io/projected/73ef00a9-8d50-49fb-84ae-669fff822e30-kube-api-access-gt6fs\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.936894 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config-secret\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.936945 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ef00a9-8d50-49fb-84ae-669fff822e30-combined-ca-bundle\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.936971 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.937017 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.937039 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/73ef00a9-8d50-49fb-84ae-669fff822e30-openstack-config\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: E1003 08:58:44.938601 4765 projected.go:194] Error preparing data for projected volume kube-api-access-pljnq for pod 
watcher-kuttl-default/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (b02ef8ff-0096-42b2-be52-8c391f444d1c) does not match the UID in record. The object might have been deleted and then recreated Oct 03 08:58:44 crc kubenswrapper[4765]: E1003 08:58:44.938697 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b02ef8ff-0096-42b2-be52-8c391f444d1c-kube-api-access-pljnq podName:b02ef8ff-0096-42b2-be52-8c391f444d1c nodeName:}" failed. No retries permitted until 2025-10-03 08:58:45.438679368 +0000 UTC m=+1169.740173698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-pljnq" (UniqueName: "kubernetes.io/projected/b02ef8ff-0096-42b2-be52-8c391f444d1c-kube-api-access-pljnq") pod "openstackclient" (UID: "b02ef8ff-0096-42b2-be52-8c391f444d1c") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (b02ef8ff-0096-42b2-be52-8c391f444d1c) does not match the UID in record. The object might have been deleted and then recreated Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.938597 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.943591 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config-secret\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:44 crc kubenswrapper[4765]: I1003 08:58:44.943718 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.038610 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ef00a9-8d50-49fb-84ae-669fff822e30-combined-ca-bundle\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.038696 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/73ef00a9-8d50-49fb-84ae-669fff822e30-openstack-config\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.038743 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/73ef00a9-8d50-49fb-84ae-669fff822e30-openstack-config-secret\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.038786 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gt6fs\" (UniqueName: \"kubernetes.io/projected/73ef00a9-8d50-49fb-84ae-669fff822e30-kube-api-access-gt6fs\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.039683 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/73ef00a9-8d50-49fb-84ae-669fff822e30-openstack-config\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.043712 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/73ef00a9-8d50-49fb-84ae-669fff822e30-openstack-config-secret\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.046289 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ef00a9-8d50-49fb-84ae-669fff822e30-combined-ca-bundle\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.058230 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt6fs\" (UniqueName: \"kubernetes.io/projected/73ef00a9-8d50-49fb-84ae-669fff822e30-kube-api-access-gt6fs\") pod \"openstackclient\" (UID: \"73ef00a9-8d50-49fb-84ae-669fff822e30\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.216362 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.439383 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.442731 4765 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="watcher-kuttl-default/openstackclient" oldPodUID="b02ef8ff-0096-42b2-be52-8c391f444d1c" podUID="73ef00a9-8d50-49fb-84ae-669fff822e30" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.445338 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pljnq\" (UniqueName: \"kubernetes.io/projected/b02ef8ff-0096-42b2-be52-8c391f444d1c-kube-api-access-pljnq\") pod \"openstackclient\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: E1003 08:58:45.449218 4765 projected.go:194] Error preparing data for projected volume kube-api-access-pljnq for pod watcher-kuttl-default/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (b02ef8ff-0096-42b2-be52-8c391f444d1c) does not match the UID in record. 
The object might have been deleted and then recreated Oct 03 08:58:45 crc kubenswrapper[4765]: E1003 08:58:45.449299 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b02ef8ff-0096-42b2-be52-8c391f444d1c-kube-api-access-pljnq podName:b02ef8ff-0096-42b2-be52-8c391f444d1c nodeName:}" failed. No retries permitted until 2025-10-03 08:58:46.44928137 +0000 UTC m=+1170.750775700 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pljnq" (UniqueName: "kubernetes.io/projected/b02ef8ff-0096-42b2-be52-8c391f444d1c-kube-api-access-pljnq") pod "openstackclient" (UID: "b02ef8ff-0096-42b2-be52-8c391f444d1c") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (b02ef8ff-0096-42b2-be52-8c391f444d1c) does not match the UID in record. The object might have been deleted and then recreated Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.453870 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.648234 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config\") pod \"b02ef8ff-0096-42b2-be52-8c391f444d1c\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.648360 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-combined-ca-bundle\") pod \"b02ef8ff-0096-42b2-be52-8c391f444d1c\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.648405 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config-secret\") pod \"b02ef8ff-0096-42b2-be52-8c391f444d1c\" (UID: \"b02ef8ff-0096-42b2-be52-8c391f444d1c\") " Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.648803 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pljnq\" (UniqueName: \"kubernetes.io/projected/b02ef8ff-0096-42b2-be52-8c391f444d1c-kube-api-access-pljnq\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.648813 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b02ef8ff-0096-42b2-be52-8c391f444d1c" (UID: "b02ef8ff-0096-42b2-be52-8c391f444d1c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.652418 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b02ef8ff-0096-42b2-be52-8c391f444d1c" (UID: "b02ef8ff-0096-42b2-be52-8c391f444d1c"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.652534 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b02ef8ff-0096-42b2-be52-8c391f444d1c" (UID: "b02ef8ff-0096-42b2-be52-8c391f444d1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.667876 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/openstackclient"] Oct 03 08:58:45 crc kubenswrapper[4765]: W1003 08:58:45.669263 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73ef00a9_8d50_49fb_84ae_669fff822e30.slice/crio-8d2ebc815aea90c87c7b0a254882c846b71ddfbede2eadf67c7b8d0340883b7f WatchSource:0}: Error finding container 8d2ebc815aea90c87c7b0a254882c846b71ddfbede2eadf67c7b8d0340883b7f: Status 404 returned error can't find the container with id 8d2ebc815aea90c87c7b0a254882c846b71ddfbede2eadf67c7b8d0340883b7f Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.750238 4765 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.750285 4765 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b02ef8ff-0096-42b2-be52-8c391f444d1c-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:45 crc kubenswrapper[4765]: I1003 08:58:45.750299 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b02ef8ff-0096-42b2-be52-8c391f444d1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:46 crc kubenswrapper[4765]: I1003 08:58:46.316613 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b02ef8ff-0096-42b2-be52-8c391f444d1c" path="/var/lib/kubelet/pods/b02ef8ff-0096-42b2-be52-8c391f444d1c/volumes" Oct 03 08:58:46 crc kubenswrapper[4765]: I1003 08:58:46.449423 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"73ef00a9-8d50-49fb-84ae-669fff822e30","Type":"ContainerStarted","Data":"8d2ebc815aea90c87c7b0a254882c846b71ddfbede2eadf67c7b8d0340883b7f"} Oct 03 08:58:46 crc kubenswrapper[4765]: I1003 08:58:46.449455 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/openstackclient" Oct 03 08:58:46 crc kubenswrapper[4765]: I1003 08:58:46.457767 4765 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="watcher-kuttl-default/openstackclient" oldPodUID="b02ef8ff-0096-42b2-be52-8c391f444d1c" podUID="73ef00a9-8d50-49fb-84ae-669fff822e30" Oct 03 08:58:50 crc kubenswrapper[4765]: I1003 08:58:50.668908 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:53 crc kubenswrapper[4765]: I1003 08:58:53.934684 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Oct 03 08:58:53 crc kubenswrapper[4765]: I1003 08:58:53.935453 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/kube-state-metrics-0" podUID="14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6" containerName="kube-state-metrics" containerID="cri-o://1b6e759505f867fd22a8088522d2da7078cb4c81ab2d23b44ac6b3214f9217a0" gracePeriod=30 Oct 03 08:58:54 crc kubenswrapper[4765]: I1003 08:58:54.529247 4765 generic.go:334] "Generic (PLEG): container finished" podID="14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6" containerID="1b6e759505f867fd22a8088522d2da7078cb4c81ab2d23b44ac6b3214f9217a0" exitCode=2 Oct 03 08:58:54 crc kubenswrapper[4765]: I1003 08:58:54.529328 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6","Type":"ContainerDied","Data":"1b6e759505f867fd22a8088522d2da7078cb4c81ab2d23b44ac6b3214f9217a0"} Oct 03 08:58:54 crc kubenswrapper[4765]: I1003 08:58:54.771504 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:54 crc kubenswrapper[4765]: I1003 08:58:54.812765 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7bdv\" (UniqueName: \"kubernetes.io/projected/14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6-kube-api-access-w7bdv\") pod \"14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6\" (UID: \"14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6\") " Oct 03 08:58:54 crc kubenswrapper[4765]: I1003 08:58:54.817981 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6-kube-api-access-w7bdv" (OuterVolumeSpecName: "kube-api-access-w7bdv") pod "14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6" (UID: "14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6"). InnerVolumeSpecName "kube-api-access-w7bdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:54 crc kubenswrapper[4765]: I1003 08:58:54.914852 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7bdv\" (UniqueName: \"kubernetes.io/projected/14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6-kube-api-access-w7bdv\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.051289 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.051822 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="ceilometer-central-agent" containerID="cri-o://e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a" gracePeriod=30 Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.051849 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="sg-core" containerID="cri-o://52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880" gracePeriod=30 Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.051880 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="ceilometer-notification-agent" containerID="cri-o://08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20" gracePeriod=30 Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.051843 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="proxy-httpd" containerID="cri-o://bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315" gracePeriod=30 Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.539052 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/openstackclient" event={"ID":"73ef00a9-8d50-49fb-84ae-669fff822e30","Type":"ContainerStarted","Data":"1b2032ea556a89698f69767283d1edef8b5138889d43f187cf64e45e9da42de9"} Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.543873 4765 generic.go:334] "Generic (PLEG): container finished" podID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerID="bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315" exitCode=0 Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.543911 4765 generic.go:334] "Generic (PLEG): container finished" podID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerID="52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880" exitCode=2 Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.543955 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f","Type":"ContainerDied","Data":"bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315"} Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.543985 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f","Type":"ContainerDied","Data":"52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880"} Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.545957 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" 
event={"ID":"14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6","Type":"ContainerDied","Data":"c56ec4f720b52257866ffb6a14674698dc98b2e177ccc184e4072ddf9c240f40"} Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.545998 4765 scope.go:117] "RemoveContainer" containerID="1b6e759505f867fd22a8088522d2da7078cb4c81ab2d23b44ac6b3214f9217a0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.546122 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.567989 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/openstackclient" podStartSLOduration=2.738688318 podStartE2EDuration="11.567967495s" podCreationTimestamp="2025-10-03 08:58:44 +0000 UTC" firstStartedPulling="2025-10-03 08:58:45.671314284 +0000 UTC m=+1169.972808614" lastFinishedPulling="2025-10-03 08:58:54.500593461 +0000 UTC m=+1178.802087791" observedRunningTime="2025-10-03 08:58:55.563028211 +0000 UTC m=+1179.864522541" watchObservedRunningTime="2025-10-03 08:58:55.567967495 +0000 UTC m=+1179.869461825" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.588347 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.597556 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.621258 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Oct 03 08:58:55 crc kubenswrapper[4765]: E1003 08:58:55.621632 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6" containerName="kube-state-metrics" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.621672 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6" containerName="kube-state-metrics" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.621868 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6" containerName="kube-state-metrics" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.622400 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.624904 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-kube-state-metrics-svc" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.624984 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.625019 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zqss\" (UniqueName: \"kubernetes.io/projected/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-kube-api-access-5zqss\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.625065 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.625105 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.625123 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"kube-state-metrics-tls-config" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.639099 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.726456 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zqss\" (UniqueName: \"kubernetes.io/projected/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-kube-api-access-5zqss\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.726515 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.726551 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 
08:58:55.726692 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.732430 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.733861 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.734177 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.765148 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zqss\" (UniqueName: \"kubernetes.io/projected/f2c99332-43c7-45cc-b2d3-83f8fe1ffc41-kube-api-access-5zqss\") pod \"kube-state-metrics-0\" (UID: \"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41\") " pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:55 crc kubenswrapper[4765]: I1003 08:58:55.939958 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:56 crc kubenswrapper[4765]: I1003 08:58:56.345851 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6" path="/var/lib/kubelet/pods/14d4a2cf-7c1b-4e8d-a42c-01cb979e78b6/volumes" Oct 03 08:58:56 crc kubenswrapper[4765]: I1003 08:58:56.399524 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/kube-state-metrics-0"] Oct 03 08:58:56 crc kubenswrapper[4765]: I1003 08:58:56.556241 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41","Type":"ContainerStarted","Data":"01a4b64bd558e1caef3a369eae021c5884457405c6bd85d75bf80c14fdf4172c"} Oct 03 08:58:56 crc kubenswrapper[4765]: I1003 08:58:56.559583 4765 generic.go:334] "Generic (PLEG): container finished" podID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerID="e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a" exitCode=0 Oct 03 08:58:56 crc kubenswrapper[4765]: I1003 08:58:56.559672 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f","Type":"ContainerDied","Data":"e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a"} Oct 03 08:58:57 crc kubenswrapper[4765]: I1003 08:58:57.569300 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/kube-state-metrics-0" event={"ID":"f2c99332-43c7-45cc-b2d3-83f8fe1ffc41","Type":"ContainerStarted","Data":"41e439cbc3b77e6218565f31d2daecc64b3501ae55fab6a21079d3c118482a3d"} Oct 03 08:58:57 crc kubenswrapper[4765]: I1003 08:58:57.569601 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/kube-state-metrics-0" Oct 03 08:58:57 crc kubenswrapper[4765]: I1003 08:58:57.593193 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/kube-state-metrics-0" podStartSLOduration=2.222321612 podStartE2EDuration="2.593164849s" podCreationTimestamp="2025-10-03 08:58:55 +0000 UTC" firstStartedPulling="2025-10-03 08:58:56.404994996 +0000 UTC m=+1180.706489326" lastFinishedPulling="2025-10-03 08:58:56.775838233 +0000 UTC m=+1181.077332563" observedRunningTime="2025-10-03 08:58:57.586951952 +0000 UTC m=+1181.888446292" watchObservedRunningTime="2025-10-03 08:58:57.593164849 +0000 UTC m=+1181.894659179" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.513267 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.588744 4765 generic.go:334] "Generic (PLEG): container finished" podID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerID="08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20" exitCode=0 Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.588828 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.589054 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f","Type":"ContainerDied","Data":"08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20"} Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.589156 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f","Type":"ContainerDied","Data":"05f0f678fcd2d926c5ea3b92971788eee00516772d1ca6c635591cff46dcf269"} Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.589215 4765 scope.go:117] "RemoveContainer" containerID="bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.607524 4765 scope.go:117] "RemoveContainer" containerID="52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.627109 4765 scope.go:117] "RemoveContainer" containerID="08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.646036 4765 scope.go:117] "RemoveContainer" containerID="e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.667320 4765 scope.go:117] "RemoveContainer" containerID="bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315" Oct 03 08:58:59 crc kubenswrapper[4765]: E1003 08:58:59.667631 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315\": container with ID starting with bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315 not found: ID does not exist" containerID="bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.667679 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315"} err="failed to get container status \"bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315\": rpc error: code = NotFound desc = could not find container \"bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315\": container with ID starting with bd5628f5c002ec422177e1a72b7723485fe07333a532b44a537183a1c80df315 not found: ID does not exist" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.667709 4765 scope.go:117] "RemoveContainer" containerID="52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880" Oct 03 08:58:59 crc kubenswrapper[4765]: E1003 08:58:59.668102 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880\": container with ID starting with 52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880 not found: ID does not exist" containerID="52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.668138 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880"} err="failed to get container status 
\"52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880\": rpc error: code = NotFound desc = could not find container \"52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880\": container with ID starting with 52c42b4fe242f64aa422aa73c90b706046bdf7462ec62c69019e10fea6e6a880 not found: ID does not exist" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.668166 4765 scope.go:117] "RemoveContainer" containerID="08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20" Oct 03 08:58:59 crc kubenswrapper[4765]: E1003 08:58:59.668798 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20\": container with ID starting with 08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20 not found: ID does not exist" containerID="08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.668829 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20"} err="failed to get container status \"08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20\": rpc error: code = NotFound desc = could not find container \"08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20\": container with ID starting with 08d3b95b3561f434a5a390b93cba2f63a2baeeae29be761516c31f7b437f1a20 not found: ID does not exist" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.668846 4765 scope.go:117] "RemoveContainer" containerID="e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a" Oct 03 08:58:59 crc kubenswrapper[4765]: E1003 08:58:59.669266 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a\": container with ID starting with e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a not found: ID does not exist" containerID="e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.669316 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a"} err="failed to get container status \"e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a\": rpc error: code = NotFound desc = could not find container \"e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a\": container with ID starting with e1ad12b5f06caec0588dd05af5410721636bc5a0695cf0c2b98a9f114ad5c76a not found: ID does not exist" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.695408 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-sg-core-conf-yaml\") pod \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.695455 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-run-httpd\") pod \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 
08:58:59.695520 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjqj8\" (UniqueName: \"kubernetes.io/projected/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-kube-api-access-rjqj8\") pod \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.695539 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-log-httpd\") pod \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.695579 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-combined-ca-bundle\") pod \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.695671 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-config-data\") pod \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.695706 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-scripts\") pod \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\" (UID: \"143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f\") " Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.696134 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" (UID: "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.696315 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" (UID: "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.715399 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-kube-api-access-rjqj8" (OuterVolumeSpecName: "kube-api-access-rjqj8") pod "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" (UID: "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f"). InnerVolumeSpecName "kube-api-access-rjqj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.715489 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-scripts" (OuterVolumeSpecName: "scripts") pod "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" (UID: "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.724816 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" (UID: "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.778851 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" (UID: "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.786839 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-config-data" (OuterVolumeSpecName: "config-data") pod "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" (UID: "143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.797837 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.797873 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.797883 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjqj8\" (UniqueName: \"kubernetes.io/projected/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-kube-api-access-rjqj8\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.797892 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.797901 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.797911 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.797919 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.922014 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.928635 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 
08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.942998 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:58:59 crc kubenswrapper[4765]: E1003 08:58:59.943393 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="sg-core" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.943419 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="sg-core" Oct 03 08:58:59 crc kubenswrapper[4765]: E1003 08:58:59.943446 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="proxy-httpd" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.943455 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="proxy-httpd" Oct 03 08:58:59 crc kubenswrapper[4765]: E1003 08:58:59.943465 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="ceilometer-central-agent" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.943475 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="ceilometer-central-agent" Oct 03 08:58:59 crc kubenswrapper[4765]: E1003 08:58:59.943494 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="ceilometer-notification-agent" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.943501 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="ceilometer-notification-agent" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.943694 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="ceilometer-central-agent" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.943714 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="sg-core" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.943731 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="ceilometer-notification-agent" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.943743 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" containerName="proxy-httpd" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.945606 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.948042 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.948377 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.948586 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 08:58:59 crc kubenswrapper[4765]: I1003 08:58:59.955955 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.102207 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.102262 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-log-httpd\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.102307 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-scripts\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.102412 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.102430 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dwb4\" (UniqueName: \"kubernetes.io/projected/94554d2b-e45b-4c38-8871-e3b4febc81c9-kube-api-access-6dwb4\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.102451 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-config-data\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.102478 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.102708 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-run-httpd\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.204517 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-run-httpd\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.204566 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.204597 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-log-httpd\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.204666 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-scripts\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.204689 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.204710 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dwb4\" (UniqueName: \"kubernetes.io/projected/94554d2b-e45b-4c38-8871-e3b4febc81c9-kube-api-access-6dwb4\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.204736 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-config-data\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.204771 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.205166 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-run-httpd\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") 
" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.205186 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-log-httpd\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.208159 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.208665 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-scripts\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.209364 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-config-data\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.210667 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.219867 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.222310 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dwb4\" (UniqueName: \"kubernetes.io/projected/94554d2b-e45b-4c38-8871-e3b4febc81c9-kube-api-access-6dwb4\") pod \"ceilometer-0\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.261779 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.339636 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f" path="/var/lib/kubelet/pods/143fccd6-3ba5-49ce-bf0b-e5a89f3ca38f/volumes" Oct 03 08:59:00 crc kubenswrapper[4765]: I1003 08:59:00.689611 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 08:59:00 crc kubenswrapper[4765]: W1003 08:59:00.703145 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94554d2b_e45b_4c38_8871_e3b4febc81c9.slice/crio-57149f8ee7f3aab948f11fdde03b72b6885a6e1ef3e320cd48a9231407f188ac WatchSource:0}: Error finding container 57149f8ee7f3aab948f11fdde03b72b6885a6e1ef3e320cd48a9231407f188ac: Status 404 returned error can't find the container with id 57149f8ee7f3aab948f11fdde03b72b6885a6e1ef3e320cd48a9231407f188ac Oct 03 08:59:01 crc kubenswrapper[4765]: I1003 08:59:01.609955 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"94554d2b-e45b-4c38-8871-e3b4febc81c9","Type":"ContainerStarted","Data":"470a9cbaae5d3d072025eb3f50a1b783b9601912597f3957f8e110e9c1176a2f"} Oct 03 08:59:01 crc kubenswrapper[4765]: I1003 08:59:01.609999 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"94554d2b-e45b-4c38-8871-e3b4febc81c9","Type":"ContainerStarted","Data":"57149f8ee7f3aab948f11fdde03b72b6885a6e1ef3e320cd48a9231407f188ac"} Oct 03 08:59:02 crc kubenswrapper[4765]: I1003 08:59:02.618981 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"94554d2b-e45b-4c38-8871-e3b4febc81c9","Type":"ContainerStarted","Data":"7e5174f0fd52405eb17accd4f6b83457a5ef9e014ed5831cd1e8e6991515d187"} Oct 03 08:59:03 crc kubenswrapper[4765]: I1003 08:59:03.638313 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"94554d2b-e45b-4c38-8871-e3b4febc81c9","Type":"ContainerStarted","Data":"453023e03ab393ece167179e27b70c18a91576ec723a5ab19ab3761315a90975"} Oct 03 08:59:04 crc kubenswrapper[4765]: I1003 08:59:04.647190 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"94554d2b-e45b-4c38-8871-e3b4febc81c9","Type":"ContainerStarted","Data":"648e88427fda1612f35b3aa26beb8885f330cf2512ef2c142fc9274949e3ca4f"} Oct 03 08:59:04 crc kubenswrapper[4765]: I1003 08:59:04.647542 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:04 crc kubenswrapper[4765]: I1003 08:59:04.667980 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.214416742 podStartE2EDuration="5.667961859s" podCreationTimestamp="2025-10-03 08:58:59 +0000 UTC" firstStartedPulling="2025-10-03 08:59:00.704748084 +0000 UTC m=+1185.006242414" lastFinishedPulling="2025-10-03 08:59:04.158293161 +0000 UTC m=+1188.459787531" observedRunningTime="2025-10-03 08:59:04.663903857 +0000 UTC m=+1188.965398197" watchObservedRunningTime="2025-10-03 08:59:04.667961859 +0000 UTC m=+1188.969456189" Oct 03 08:59:06 crc kubenswrapper[4765]: I1003 08:59:06.087996 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/kube-state-metrics-0" 
Oct 03 08:59:30 crc kubenswrapper[4765]: I1003 08:59:30.269876 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 08:59:35 crc kubenswrapper[4765]: I1003 08:59:35.390096 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-skwjk"] Oct 03 08:59:35 crc kubenswrapper[4765]: I1003 08:59:35.391790 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-skwjk" Oct 03 08:59:35 crc kubenswrapper[4765]: I1003 08:59:35.404183 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-skwjk"] Oct 03 08:59:35 crc kubenswrapper[4765]: I1003 08:59:35.578957 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-252v4\" (UniqueName: \"kubernetes.io/projected/de1b6b9b-b149-41f6-8657-d3e2cfac1e32-kube-api-access-252v4\") pod \"watcher-db-create-skwjk\" (UID: \"de1b6b9b-b149-41f6-8657-d3e2cfac1e32\") " pod="watcher-kuttl-default/watcher-db-create-skwjk" Oct 03 08:59:35 crc kubenswrapper[4765]: I1003 08:59:35.680239 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-252v4\" (UniqueName: \"kubernetes.io/projected/de1b6b9b-b149-41f6-8657-d3e2cfac1e32-kube-api-access-252v4\") pod \"watcher-db-create-skwjk\" (UID: \"de1b6b9b-b149-41f6-8657-d3e2cfac1e32\") " pod="watcher-kuttl-default/watcher-db-create-skwjk" Oct 03 08:59:35 crc kubenswrapper[4765]: I1003 08:59:35.715025 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-252v4\" (UniqueName: \"kubernetes.io/projected/de1b6b9b-b149-41f6-8657-d3e2cfac1e32-kube-api-access-252v4\") pod \"watcher-db-create-skwjk\" (UID: \"de1b6b9b-b149-41f6-8657-d3e2cfac1e32\") " pod="watcher-kuttl-default/watcher-db-create-skwjk" Oct 03 08:59:36 crc kubenswrapper[4765]: I1003 08:59:36.013384 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-skwjk" Oct 03 08:59:36 crc kubenswrapper[4765]: I1003 08:59:36.499563 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-skwjk"] Oct 03 08:59:36 crc kubenswrapper[4765]: I1003 08:59:36.898085 4765 generic.go:334] "Generic (PLEG): container finished" podID="de1b6b9b-b149-41f6-8657-d3e2cfac1e32" containerID="c0ee5a1efc2893c2e7c90c4b084b0786c801d29e68edff371ca727b1dfd86db0" exitCode=0 Oct 03 08:59:36 crc kubenswrapper[4765]: I1003 08:59:36.898479 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-skwjk" event={"ID":"de1b6b9b-b149-41f6-8657-d3e2cfac1e32","Type":"ContainerDied","Data":"c0ee5a1efc2893c2e7c90c4b084b0786c801d29e68edff371ca727b1dfd86db0"} Oct 03 08:59:36 crc kubenswrapper[4765]: I1003 08:59:36.898524 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-skwjk" event={"ID":"de1b6b9b-b149-41f6-8657-d3e2cfac1e32","Type":"ContainerStarted","Data":"004137ffb291cb4d4859bb3b8d264dc045a74781824747cb7d4796da4915178f"} Oct 03 08:59:38 crc kubenswrapper[4765]: I1003 08:59:38.309062 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-skwjk" Oct 03 08:59:38 crc kubenswrapper[4765]: I1003 08:59:38.351919 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-252v4\" (UniqueName: \"kubernetes.io/projected/de1b6b9b-b149-41f6-8657-d3e2cfac1e32-kube-api-access-252v4\") pod \"de1b6b9b-b149-41f6-8657-d3e2cfac1e32\" (UID: \"de1b6b9b-b149-41f6-8657-d3e2cfac1e32\") " Oct 03 08:59:38 crc kubenswrapper[4765]: I1003 08:59:38.361926 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1b6b9b-b149-41f6-8657-d3e2cfac1e32-kube-api-access-252v4" (OuterVolumeSpecName: "kube-api-access-252v4") pod "de1b6b9b-b149-41f6-8657-d3e2cfac1e32" (UID: "de1b6b9b-b149-41f6-8657-d3e2cfac1e32"). InnerVolumeSpecName "kube-api-access-252v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:38 crc kubenswrapper[4765]: I1003 08:59:38.454718 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-252v4\" (UniqueName: \"kubernetes.io/projected/de1b6b9b-b149-41f6-8657-d3e2cfac1e32-kube-api-access-252v4\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:38 crc kubenswrapper[4765]: I1003 08:59:38.918264 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-skwjk" event={"ID":"de1b6b9b-b149-41f6-8657-d3e2cfac1e32","Type":"ContainerDied","Data":"004137ffb291cb4d4859bb3b8d264dc045a74781824747cb7d4796da4915178f"} Oct 03 08:59:38 crc kubenswrapper[4765]: I1003 08:59:38.918313 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="004137ffb291cb4d4859bb3b8d264dc045a74781824747cb7d4796da4915178f" Oct 03 08:59:38 crc kubenswrapper[4765]: I1003 08:59:38.918366 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-skwjk" Oct 03 08:59:45 crc kubenswrapper[4765]: I1003 08:59:45.479546 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-3897-account-create-qd2xq"] Oct 03 08:59:45 crc kubenswrapper[4765]: E1003 08:59:45.480440 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1b6b9b-b149-41f6-8657-d3e2cfac1e32" containerName="mariadb-database-create" Oct 03 08:59:45 crc kubenswrapper[4765]: I1003 08:59:45.480454 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1b6b9b-b149-41f6-8657-d3e2cfac1e32" containerName="mariadb-database-create" Oct 03 08:59:45 crc kubenswrapper[4765]: I1003 08:59:45.480590 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1b6b9b-b149-41f6-8657-d3e2cfac1e32" containerName="mariadb-database-create" Oct 03 08:59:45 crc kubenswrapper[4765]: I1003 08:59:45.481205 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3897-account-create-qd2xq" Oct 03 08:59:45 crc kubenswrapper[4765]: I1003 08:59:45.485095 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Oct 03 08:59:45 crc kubenswrapper[4765]: I1003 08:59:45.488904 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3897-account-create-qd2xq"] Oct 03 08:59:45 crc kubenswrapper[4765]: I1003 08:59:45.570479 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkgcs\" (UniqueName: \"kubernetes.io/projected/6c523deb-e99b-4a55-85f7-35346117544f-kube-api-access-pkgcs\") pod \"watcher-3897-account-create-qd2xq\" (UID: \"6c523deb-e99b-4a55-85f7-35346117544f\") " pod="watcher-kuttl-default/watcher-3897-account-create-qd2xq" Oct 03 08:59:45 crc kubenswrapper[4765]: I1003 08:59:45.672720 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkgcs\" (UniqueName: \"kubernetes.io/projected/6c523deb-e99b-4a55-85f7-35346117544f-kube-api-access-pkgcs\") pod \"watcher-3897-account-create-qd2xq\" (UID: \"6c523deb-e99b-4a55-85f7-35346117544f\") " pod="watcher-kuttl-default/watcher-3897-account-create-qd2xq" Oct 03 08:59:45 crc kubenswrapper[4765]: I1003 08:59:45.690461 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkgcs\" (UniqueName: \"kubernetes.io/projected/6c523deb-e99b-4a55-85f7-35346117544f-kube-api-access-pkgcs\") pod \"watcher-3897-account-create-qd2xq\" (UID: \"6c523deb-e99b-4a55-85f7-35346117544f\") " pod="watcher-kuttl-default/watcher-3897-account-create-qd2xq" Oct 03 08:59:45 crc kubenswrapper[4765]: I1003 08:59:45.797933 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3897-account-create-qd2xq" Oct 03 08:59:46 crc kubenswrapper[4765]: I1003 08:59:46.224765 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3897-account-create-qd2xq"] Oct 03 08:59:46 crc kubenswrapper[4765]: I1003 08:59:46.979869 4765 generic.go:334] "Generic (PLEG): container finished" podID="6c523deb-e99b-4a55-85f7-35346117544f" containerID="d30cd7dcf5b79b28884651b6af6741815ff55b619c7d6fb4417ac39d2831b88d" exitCode=0 Oct 03 08:59:46 crc kubenswrapper[4765]: I1003 08:59:46.979911 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3897-account-create-qd2xq" event={"ID":"6c523deb-e99b-4a55-85f7-35346117544f","Type":"ContainerDied","Data":"d30cd7dcf5b79b28884651b6af6741815ff55b619c7d6fb4417ac39d2831b88d"} Oct 03 08:59:46 crc kubenswrapper[4765]: I1003 08:59:46.980249 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3897-account-create-qd2xq" event={"ID":"6c523deb-e99b-4a55-85f7-35346117544f","Type":"ContainerStarted","Data":"d7c3ecd2e0867e83a361a40a88df05d53e86c78bd16a098c84e234aa219cb4eb"} Oct 03 08:59:48 crc kubenswrapper[4765]: I1003 08:59:48.287155 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3897-account-create-qd2xq" Oct 03 08:59:48 crc kubenswrapper[4765]: I1003 08:59:48.320075 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkgcs\" (UniqueName: \"kubernetes.io/projected/6c523deb-e99b-4a55-85f7-35346117544f-kube-api-access-pkgcs\") pod \"6c523deb-e99b-4a55-85f7-35346117544f\" (UID: \"6c523deb-e99b-4a55-85f7-35346117544f\") " Oct 03 08:59:48 crc kubenswrapper[4765]: I1003 08:59:48.329865 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c523deb-e99b-4a55-85f7-35346117544f-kube-api-access-pkgcs" (OuterVolumeSpecName: "kube-api-access-pkgcs") pod "6c523deb-e99b-4a55-85f7-35346117544f" (UID: "6c523deb-e99b-4a55-85f7-35346117544f"). InnerVolumeSpecName "kube-api-access-pkgcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:48 crc kubenswrapper[4765]: I1003 08:59:48.422335 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkgcs\" (UniqueName: \"kubernetes.io/projected/6c523deb-e99b-4a55-85f7-35346117544f-kube-api-access-pkgcs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:48 crc kubenswrapper[4765]: I1003 08:59:48.996255 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3897-account-create-qd2xq" event={"ID":"6c523deb-e99b-4a55-85f7-35346117544f","Type":"ContainerDied","Data":"d7c3ecd2e0867e83a361a40a88df05d53e86c78bd16a098c84e234aa219cb4eb"} Oct 03 08:59:48 crc kubenswrapper[4765]: I1003 08:59:48.996291 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7c3ecd2e0867e83a361a40a88df05d53e86c78bd16a098c84e234aa219cb4eb" Oct 03 08:59:48 crc kubenswrapper[4765]: I1003 08:59:48.996343 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3897-account-create-qd2xq" Oct 03 08:59:50 crc kubenswrapper[4765]: I1003 08:59:50.842892 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c"] Oct 03 08:59:50 crc kubenswrapper[4765]: E1003 08:59:50.844961 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c523deb-e99b-4a55-85f7-35346117544f" containerName="mariadb-account-create" Oct 03 08:59:50 crc kubenswrapper[4765]: I1003 08:59:50.845048 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c523deb-e99b-4a55-85f7-35346117544f" containerName="mariadb-account-create" Oct 03 08:59:50 crc kubenswrapper[4765]: I1003 08:59:50.845326 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c523deb-e99b-4a55-85f7-35346117544f" containerName="mariadb-account-create" Oct 03 08:59:50 crc kubenswrapper[4765]: I1003 08:59:50.846152 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:50 crc kubenswrapper[4765]: I1003 08:59:50.848305 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Oct 03 08:59:50 crc kubenswrapper[4765]: I1003 08:59:50.849174 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-jd8v2" Oct 03 08:59:50 crc kubenswrapper[4765]: I1003 08:59:50.852118 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c"] Oct 03 08:59:50 crc kubenswrapper[4765]: I1003 08:59:50.961335 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-db-sync-config-data\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:50 crc kubenswrapper[4765]: I1003 08:59:50.961378 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-config-data\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:50 crc kubenswrapper[4765]: I1003 08:59:50.961401 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkz5w\" (UniqueName: \"kubernetes.io/projected/498f14d0-1965-4754-8cb4-5742535aa52b-kube-api-access-rkz5w\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:50 crc kubenswrapper[4765]: I1003 08:59:50.961437 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:51 crc kubenswrapper[4765]: I1003 08:59:51.062441 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkz5w\" (UniqueName: \"kubernetes.io/projected/498f14d0-1965-4754-8cb4-5742535aa52b-kube-api-access-rkz5w\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:51 crc kubenswrapper[4765]: I1003 08:59:51.062513 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:51 crc kubenswrapper[4765]: I1003 08:59:51.062667 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-db-sync-config-data\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 
08:59:51 crc kubenswrapper[4765]: I1003 08:59:51.062693 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-config-data\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:51 crc kubenswrapper[4765]: I1003 08:59:51.068454 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:51 crc kubenswrapper[4765]: I1003 08:59:51.068602 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-db-sync-config-data\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:51 crc kubenswrapper[4765]: I1003 08:59:51.069945 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-config-data\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:51 crc kubenswrapper[4765]: I1003 08:59:51.102634 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkz5w\" (UniqueName: \"kubernetes.io/projected/498f14d0-1965-4754-8cb4-5742535aa52b-kube-api-access-rkz5w\") pod \"watcher-kuttl-db-sync-4ll9c\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:51 crc kubenswrapper[4765]: I1003 08:59:51.167753 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 08:59:51 crc kubenswrapper[4765]: I1003 08:59:51.661179 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c"] Oct 03 08:59:52 crc kubenswrapper[4765]: I1003 08:59:52.023096 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" event={"ID":"498f14d0-1965-4754-8cb4-5742535aa52b","Type":"ContainerStarted","Data":"686ddc4b4532778ab2148c67a8afe53c58bb33ac120f17e95558c1abd6d8f320"} Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.155484 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x"] Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.158275 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.165165 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.165844 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.181809 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x"] Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.318070 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59d35e34-4a3c-4e83-b87c-f88abb914e38-secret-volume\") pod \"collect-profiles-29324700-b289x\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.318138 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpz8m\" (UniqueName: \"kubernetes.io/projected/59d35e34-4a3c-4e83-b87c-f88abb914e38-kube-api-access-hpz8m\") pod \"collect-profiles-29324700-b289x\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.318167 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59d35e34-4a3c-4e83-b87c-f88abb914e38-config-volume\") pod \"collect-profiles-29324700-b289x\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.419641 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59d35e34-4a3c-4e83-b87c-f88abb914e38-secret-volume\") pod \"collect-profiles-29324700-b289x\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.420735 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpz8m\" (UniqueName: \"kubernetes.io/projected/59d35e34-4a3c-4e83-b87c-f88abb914e38-kube-api-access-hpz8m\") pod \"collect-profiles-29324700-b289x\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.420761 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59d35e34-4a3c-4e83-b87c-f88abb914e38-config-volume\") pod \"collect-profiles-29324700-b289x\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.421749 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59d35e34-4a3c-4e83-b87c-f88abb914e38-config-volume\") pod 
\"collect-profiles-29324700-b289x\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.427827 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59d35e34-4a3c-4e83-b87c-f88abb914e38-secret-volume\") pod \"collect-profiles-29324700-b289x\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.444338 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpz8m\" (UniqueName: \"kubernetes.io/projected/59d35e34-4a3c-4e83-b87c-f88abb914e38-kube-api-access-hpz8m\") pod \"collect-profiles-29324700-b289x\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:00 crc kubenswrapper[4765]: I1003 09:00:00.507003 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:06 crc kubenswrapper[4765]: I1003 09:00:06.566996 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x"] Oct 03 09:00:06 crc kubenswrapper[4765]: E1003 09:00:06.854253 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.58:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Oct 03 09:00:06 crc kubenswrapper[4765]: E1003 09:00:06.854302 4765 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.58:5001/podified-master-centos10/openstack-watcher-api:watcher_latest" Oct 03 09:00:06 crc kubenswrapper[4765]: E1003 09:00:06.854454 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:watcher-kuttl-db-sync,Image:38.102.83.58:5001/podified-master-centos10/openstack-watcher-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/watcher/watcher.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:watcher-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rkz5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-kuttl-db-sync-4ll9c_watcher-kuttl-default(498f14d0-1965-4754-8cb4-5742535aa52b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 09:00:06 crc kubenswrapper[4765]: E1003 09:00:06.855592 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" podUID="498f14d0-1965-4754-8cb4-5742535aa52b" Oct 03 09:00:07 crc kubenswrapper[4765]: I1003 09:00:07.191305 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" event={"ID":"59d35e34-4a3c-4e83-b87c-f88abb914e38","Type":"ContainerStarted","Data":"5c75a7271f802c7eb6a3bb3842cb80dd3b4b611d101773a5b44f6150ff108f2d"} Oct 03 09:00:07 crc kubenswrapper[4765]: I1003 09:00:07.191630 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" event={"ID":"59d35e34-4a3c-4e83-b87c-f88abb914e38","Type":"ContainerStarted","Data":"5f07fd1b6c184bec8e4bd471e81456b5ecad4f2a4f86227ccbc5f1d9c92eb681"} Oct 03 09:00:07 crc kubenswrapper[4765]: E1003 09:00:07.192260 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-kuttl-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.58:5001/podified-master-centos10/openstack-watcher-api:watcher_latest\\\"\"" 
pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" podUID="498f14d0-1965-4754-8cb4-5742535aa52b" Oct 03 09:00:07 crc kubenswrapper[4765]: I1003 09:00:07.225564 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" podStartSLOduration=7.225546366 podStartE2EDuration="7.225546366s" podCreationTimestamp="2025-10-03 09:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:00:07.218429957 +0000 UTC m=+1251.519924287" watchObservedRunningTime="2025-10-03 09:00:07.225546366 +0000 UTC m=+1251.527040696" Oct 03 09:00:08 crc kubenswrapper[4765]: I1003 09:00:08.199081 4765 generic.go:334] "Generic (PLEG): container finished" podID="59d35e34-4a3c-4e83-b87c-f88abb914e38" containerID="5c75a7271f802c7eb6a3bb3842cb80dd3b4b611d101773a5b44f6150ff108f2d" exitCode=0 Oct 03 09:00:08 crc kubenswrapper[4765]: I1003 09:00:08.199130 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" event={"ID":"59d35e34-4a3c-4e83-b87c-f88abb914e38","Type":"ContainerDied","Data":"5c75a7271f802c7eb6a3bb3842cb80dd3b4b611d101773a5b44f6150ff108f2d"} Oct 03 09:00:09 crc kubenswrapper[4765]: I1003 09:00:09.512693 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:09 crc kubenswrapper[4765]: I1003 09:00:09.706737 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59d35e34-4a3c-4e83-b87c-f88abb914e38-config-volume\") pod \"59d35e34-4a3c-4e83-b87c-f88abb914e38\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " Oct 03 09:00:09 crc kubenswrapper[4765]: I1003 09:00:09.706894 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59d35e34-4a3c-4e83-b87c-f88abb914e38-secret-volume\") pod \"59d35e34-4a3c-4e83-b87c-f88abb914e38\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " Oct 03 09:00:09 crc kubenswrapper[4765]: I1003 09:00:09.707030 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpz8m\" (UniqueName: \"kubernetes.io/projected/59d35e34-4a3c-4e83-b87c-f88abb914e38-kube-api-access-hpz8m\") pod \"59d35e34-4a3c-4e83-b87c-f88abb914e38\" (UID: \"59d35e34-4a3c-4e83-b87c-f88abb914e38\") " Oct 03 09:00:09 crc kubenswrapper[4765]: I1003 09:00:09.707809 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d35e34-4a3c-4e83-b87c-f88abb914e38-config-volume" (OuterVolumeSpecName: "config-volume") pod "59d35e34-4a3c-4e83-b87c-f88abb914e38" (UID: "59d35e34-4a3c-4e83-b87c-f88abb914e38"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:00:09 crc kubenswrapper[4765]: I1003 09:00:09.714441 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59d35e34-4a3c-4e83-b87c-f88abb914e38-kube-api-access-hpz8m" (OuterVolumeSpecName: "kube-api-access-hpz8m") pod "59d35e34-4a3c-4e83-b87c-f88abb914e38" (UID: "59d35e34-4a3c-4e83-b87c-f88abb914e38"). InnerVolumeSpecName "kube-api-access-hpz8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:09 crc kubenswrapper[4765]: I1003 09:00:09.716789 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d35e34-4a3c-4e83-b87c-f88abb914e38-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "59d35e34-4a3c-4e83-b87c-f88abb914e38" (UID: "59d35e34-4a3c-4e83-b87c-f88abb914e38"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:09 crc kubenswrapper[4765]: I1003 09:00:09.809159 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpz8m\" (UniqueName: \"kubernetes.io/projected/59d35e34-4a3c-4e83-b87c-f88abb914e38-kube-api-access-hpz8m\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:09 crc kubenswrapper[4765]: I1003 09:00:09.809214 4765 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59d35e34-4a3c-4e83-b87c-f88abb914e38-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:09 crc kubenswrapper[4765]: I1003 09:00:09.809227 4765 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/59d35e34-4a3c-4e83-b87c-f88abb914e38-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:10 crc kubenswrapper[4765]: I1003 09:00:10.217334 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" event={"ID":"59d35e34-4a3c-4e83-b87c-f88abb914e38","Type":"ContainerDied","Data":"5f07fd1b6c184bec8e4bd471e81456b5ecad4f2a4f86227ccbc5f1d9c92eb681"} Oct 03 09:00:10 crc kubenswrapper[4765]: I1003 09:00:10.217927 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f07fd1b6c184bec8e4bd471e81456b5ecad4f2a4f86227ccbc5f1d9c92eb681" Oct 03 09:00:10 crc kubenswrapper[4765]: I1003 09:00:10.217395 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-b289x" Oct 03 09:00:19 crc kubenswrapper[4765]: I1003 09:00:19.310248 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:00:20 crc kubenswrapper[4765]: I1003 09:00:20.292856 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" event={"ID":"498f14d0-1965-4754-8cb4-5742535aa52b","Type":"ContainerStarted","Data":"52999060ccc3806573782e5a15676472b0cb4da7a19cc36ad87486daa617d12c"} Oct 03 09:00:23 crc kubenswrapper[4765]: I1003 09:00:23.316083 4765 generic.go:334] "Generic (PLEG): container finished" podID="498f14d0-1965-4754-8cb4-5742535aa52b" containerID="52999060ccc3806573782e5a15676472b0cb4da7a19cc36ad87486daa617d12c" exitCode=0 Oct 03 09:00:23 crc kubenswrapper[4765]: I1003 09:00:23.316160 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" event={"ID":"498f14d0-1965-4754-8cb4-5742535aa52b","Type":"ContainerDied","Data":"52999060ccc3806573782e5a15676472b0cb4da7a19cc36ad87486daa617d12c"} Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.751859 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.853014 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-config-data\") pod \"498f14d0-1965-4754-8cb4-5742535aa52b\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.853084 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-db-sync-config-data\") pod \"498f14d0-1965-4754-8cb4-5742535aa52b\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.853163 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkz5w\" (UniqueName: \"kubernetes.io/projected/498f14d0-1965-4754-8cb4-5742535aa52b-kube-api-access-rkz5w\") pod \"498f14d0-1965-4754-8cb4-5742535aa52b\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.853237 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-combined-ca-bundle\") pod \"498f14d0-1965-4754-8cb4-5742535aa52b\" (UID: \"498f14d0-1965-4754-8cb4-5742535aa52b\") " Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.861307 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/498f14d0-1965-4754-8cb4-5742535aa52b-kube-api-access-rkz5w" (OuterVolumeSpecName: "kube-api-access-rkz5w") pod "498f14d0-1965-4754-8cb4-5742535aa52b" (UID: "498f14d0-1965-4754-8cb4-5742535aa52b"). InnerVolumeSpecName "kube-api-access-rkz5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.861309 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "498f14d0-1965-4754-8cb4-5742535aa52b" (UID: "498f14d0-1965-4754-8cb4-5742535aa52b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.877250 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "498f14d0-1965-4754-8cb4-5742535aa52b" (UID: "498f14d0-1965-4754-8cb4-5742535aa52b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.894820 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-config-data" (OuterVolumeSpecName: "config-data") pod "498f14d0-1965-4754-8cb4-5742535aa52b" (UID: "498f14d0-1965-4754-8cb4-5742535aa52b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.955479 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkz5w\" (UniqueName: \"kubernetes.io/projected/498f14d0-1965-4754-8cb4-5742535aa52b-kube-api-access-rkz5w\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.955521 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.955538 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:24 crc kubenswrapper[4765]: I1003 09:00:24.955550 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/498f14d0-1965-4754-8cb4-5742535aa52b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.359994 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" event={"ID":"498f14d0-1965-4754-8cb4-5742535aa52b","Type":"ContainerDied","Data":"686ddc4b4532778ab2148c67a8afe53c58bb33ac120f17e95558c1abd6d8f320"} Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.360252 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.360360 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="686ddc4b4532778ab2148c67a8afe53c58bb33ac120f17e95558c1abd6d8f320" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.652517 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:00:25 crc kubenswrapper[4765]: E1003 09:00:25.652900 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="498f14d0-1965-4754-8cb4-5742535aa52b" containerName="watcher-kuttl-db-sync" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.652917 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="498f14d0-1965-4754-8cb4-5742535aa52b" containerName="watcher-kuttl-db-sync" Oct 03 09:00:25 crc kubenswrapper[4765]: E1003 09:00:25.652927 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d35e34-4a3c-4e83-b87c-f88abb914e38" containerName="collect-profiles" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.652933 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d35e34-4a3c-4e83-b87c-f88abb914e38" containerName="collect-profiles" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.653111 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d35e34-4a3c-4e83-b87c-f88abb914e38" containerName="collect-profiles" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.653127 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="498f14d0-1965-4754-8cb4-5742535aa52b" containerName="watcher-kuttl-db-sync" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.653709 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.655144 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.655840 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-jd8v2" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.661140 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.662716 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.665929 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.676273 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.685318 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.737067 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.738065 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.740344 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.750837 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.768010 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.768062 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.768110 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrr9t\" (UniqueName: \"kubernetes.io/projected/3961e3c3-cfbc-4b31-af2d-957d38432858-kube-api-access-rrr9t\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.769082 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.769289 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-787q5\" (UniqueName: \"kubernetes.io/projected/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-kube-api-access-787q5\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.769781 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.769824 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3961e3c3-cfbc-4b31-af2d-957d38432858-logs\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.769855 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.769900 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.769963 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.870967 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871029 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc 
kubenswrapper[4765]: I1003 09:00:25.871070 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07aae327-dc35-4cbd-9566-027d2a972f57-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871115 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871151 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871177 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871207 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrr9t\" (UniqueName: \"kubernetes.io/projected/3961e3c3-cfbc-4b31-af2d-957d38432858-kube-api-access-rrr9t\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871246 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871272 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-787q5\" (UniqueName: \"kubernetes.io/projected/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-kube-api-access-787q5\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871328 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871360 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871401 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrkkx\" (UniqueName: \"kubernetes.io/projected/07aae327-dc35-4cbd-9566-027d2a972f57-kube-api-access-lrkkx\") pod \"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871427 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3961e3c3-cfbc-4b31-af2d-957d38432858-logs\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.871457 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.872259 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.872523 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3961e3c3-cfbc-4b31-af2d-957d38432858-logs\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.877614 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.877786 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.884467 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.884900 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.886942 
4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.897525 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-787q5\" (UniqueName: \"kubernetes.io/projected/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-kube-api-access-787q5\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.897627 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.907497 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrr9t\" (UniqueName: \"kubernetes.io/projected/3961e3c3-cfbc-4b31-af2d-957d38432858-kube-api-access-rrr9t\") pod \"watcher-kuttl-api-0\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.971040 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.972457 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.972509 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrkkx\" (UniqueName: \"kubernetes.io/projected/07aae327-dc35-4cbd-9566-027d2a972f57-kube-api-access-lrkkx\") pod \"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.972556 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.972587 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07aae327-dc35-4cbd-9566-027d2a972f57-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.973071 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07aae327-dc35-4cbd-9566-027d2a972f57-logs\") pod 
\"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.979015 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.983216 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.988535 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:25 crc kubenswrapper[4765]: I1003 09:00:25.992223 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrkkx\" (UniqueName: \"kubernetes.io/projected/07aae327-dc35-4cbd-9566-027d2a972f57-kube-api-access-lrkkx\") pod \"watcher-kuttl-applier-0\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:26 crc kubenswrapper[4765]: I1003 09:00:26.058312 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:26 crc kubenswrapper[4765]: I1003 09:00:26.466159 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:00:26 crc kubenswrapper[4765]: I1003 09:00:26.586888 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:00:26 crc kubenswrapper[4765]: W1003 09:00:26.589636 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3961e3c3_cfbc_4b31_af2d_957d38432858.slice/crio-7c36260de7c1b2e82a18880c567df91d5d3e03f614a6f3638612d7789fe025df WatchSource:0}: Error finding container 7c36260de7c1b2e82a18880c567df91d5d3e03f614a6f3638612d7789fe025df: Status 404 returned error can't find the container with id 7c36260de7c1b2e82a18880c567df91d5d3e03f614a6f3638612d7789fe025df Oct 03 09:00:26 crc kubenswrapper[4765]: I1003 09:00:26.681125 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:00:27 crc kubenswrapper[4765]: I1003 09:00:27.386914 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"449e82dc-3ec9-40a6-8067-a94e3ccd1be5","Type":"ContainerStarted","Data":"48370813e7c9f11310ec52d3d6264053cd2396e45d2edb81f74bacfb620fbea5"} Oct 03 09:00:27 crc kubenswrapper[4765]: I1003 09:00:27.388023 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"07aae327-dc35-4cbd-9566-027d2a972f57","Type":"ContainerStarted","Data":"70e982299d14deda4fce787103a9b1e120db1d6c4128ba597670dbaaeba6e3d5"} Oct 03 09:00:27 crc kubenswrapper[4765]: I1003 09:00:27.392910 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"3961e3c3-cfbc-4b31-af2d-957d38432858","Type":"ContainerStarted","Data":"b1be1b329830cd59739528151fc1bd5e2c1f802d74604af5fd6b2579a1031fbd"} Oct 03 09:00:27 crc kubenswrapper[4765]: I1003 09:00:27.392957 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"3961e3c3-cfbc-4b31-af2d-957d38432858","Type":"ContainerStarted","Data":"d99bc88f1622d082610e0c7f328b2d0e3ebf9d36de3a796f7ef7cf8bcbcab157"} Oct 03 09:00:27 crc kubenswrapper[4765]: I1003 09:00:27.392970 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"3961e3c3-cfbc-4b31-af2d-957d38432858","Type":"ContainerStarted","Data":"7c36260de7c1b2e82a18880c567df91d5d3e03f614a6f3638612d7789fe025df"} Oct 03 09:00:27 crc kubenswrapper[4765]: I1003 09:00:27.393232 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:27 crc kubenswrapper[4765]: I1003 09:00:27.425555 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.425528692 podStartE2EDuration="2.425528692s" podCreationTimestamp="2025-10-03 09:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:00:27.415710545 +0000 UTC m=+1271.717204895" watchObservedRunningTime="2025-10-03 09:00:27.425528692 +0000 UTC m=+1271.727023022" Oct 03 09:00:29 crc kubenswrapper[4765]: I1003 09:00:29.419895 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"07aae327-dc35-4cbd-9566-027d2a972f57","Type":"ContainerStarted","Data":"fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35"} Oct 03 09:00:29 crc kubenswrapper[4765]: I1003 09:00:29.422593 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"449e82dc-3ec9-40a6-8067-a94e3ccd1be5","Type":"ContainerStarted","Data":"267ad800f7352bd13ee8919738203ef8405405ee534397788ac16556b41cd6d5"} Oct 03 09:00:29 crc kubenswrapper[4765]: I1003 09:00:29.449131 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.8095543530000002 podStartE2EDuration="4.44910962s" podCreationTimestamp="2025-10-03 09:00:25 +0000 UTC" firstStartedPulling="2025-10-03 09:00:26.690894818 +0000 UTC m=+1270.992389138" lastFinishedPulling="2025-10-03 09:00:28.330450075 +0000 UTC m=+1272.631944405" observedRunningTime="2025-10-03 09:00:29.443092829 +0000 UTC m=+1273.744587169" watchObservedRunningTime="2025-10-03 09:00:29.44910962 +0000 UTC m=+1273.750603950" Oct 03 09:00:29 crc kubenswrapper[4765]: I1003 09:00:29.464452 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.609600867 podStartE2EDuration="4.464428596s" podCreationTimestamp="2025-10-03 09:00:25 +0000 UTC" firstStartedPulling="2025-10-03 09:00:26.471719928 +0000 UTC m=+1270.773214258" lastFinishedPulling="2025-10-03 09:00:28.326547667 +0000 UTC m=+1272.628041987" observedRunningTime="2025-10-03 09:00:29.457772619 +0000 UTC m=+1273.759266949" watchObservedRunningTime="2025-10-03 09:00:29.464428596 +0000 UTC m=+1273.765922926" Oct 03 09:00:29 crc kubenswrapper[4765]: I1003 09:00:29.765562 4765 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:30 crc kubenswrapper[4765]: I1003 09:00:30.680272 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:00:30 crc kubenswrapper[4765]: I1003 09:00:30.680335 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:00:30 crc kubenswrapper[4765]: I1003 09:00:30.990092 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:31 crc kubenswrapper[4765]: I1003 09:00:31.058688 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:35 crc kubenswrapper[4765]: I1003 09:00:35.972037 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:35 crc kubenswrapper[4765]: I1003 09:00:35.990149 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:35 crc kubenswrapper[4765]: I1003 09:00:35.997986 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:36 crc kubenswrapper[4765]: I1003 09:00:36.002004 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:36 crc kubenswrapper[4765]: I1003 09:00:36.059053 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:36 crc kubenswrapper[4765]: I1003 09:00:36.083044 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:36 crc kubenswrapper[4765]: I1003 09:00:36.477020 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:36 crc kubenswrapper[4765]: I1003 09:00:36.483442 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:36 crc kubenswrapper[4765]: I1003 09:00:36.505837 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:36 crc kubenswrapper[4765]: I1003 09:00:36.509367 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:37 crc kubenswrapper[4765]: I1003 09:00:37.643217 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:00:37 crc kubenswrapper[4765]: I1003 09:00:37.644897 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="sg-core" 
containerID="cri-o://453023e03ab393ece167179e27b70c18a91576ec723a5ab19ab3761315a90975" gracePeriod=30 Oct 03 09:00:37 crc kubenswrapper[4765]: I1003 09:00:37.644950 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="proxy-httpd" containerID="cri-o://648e88427fda1612f35b3aa26beb8885f330cf2512ef2c142fc9274949e3ca4f" gracePeriod=30 Oct 03 09:00:37 crc kubenswrapper[4765]: I1003 09:00:37.644950 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="ceilometer-notification-agent" containerID="cri-o://7e5174f0fd52405eb17accd4f6b83457a5ef9e014ed5831cd1e8e6991515d187" gracePeriod=30 Oct 03 09:00:37 crc kubenswrapper[4765]: I1003 09:00:37.644859 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="ceilometer-central-agent" containerID="cri-o://470a9cbaae5d3d072025eb3f50a1b783b9601912597f3957f8e110e9c1176a2f" gracePeriod=30 Oct 03 09:00:38 crc kubenswrapper[4765]: I1003 09:00:38.492384 4765 generic.go:334] "Generic (PLEG): container finished" podID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerID="648e88427fda1612f35b3aa26beb8885f330cf2512ef2c142fc9274949e3ca4f" exitCode=0 Oct 03 09:00:38 crc kubenswrapper[4765]: I1003 09:00:38.492714 4765 generic.go:334] "Generic (PLEG): container finished" podID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerID="453023e03ab393ece167179e27b70c18a91576ec723a5ab19ab3761315a90975" exitCode=2 Oct 03 09:00:38 crc kubenswrapper[4765]: I1003 09:00:38.492731 4765 generic.go:334] "Generic (PLEG): container finished" podID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerID="470a9cbaae5d3d072025eb3f50a1b783b9601912597f3957f8e110e9c1176a2f" exitCode=0 Oct 03 09:00:38 crc kubenswrapper[4765]: I1003 09:00:38.492465 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"94554d2b-e45b-4c38-8871-e3b4febc81c9","Type":"ContainerDied","Data":"648e88427fda1612f35b3aa26beb8885f330cf2512ef2c142fc9274949e3ca4f"} Oct 03 09:00:38 crc kubenswrapper[4765]: I1003 09:00:38.492839 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"94554d2b-e45b-4c38-8871-e3b4febc81c9","Type":"ContainerDied","Data":"453023e03ab393ece167179e27b70c18a91576ec723a5ab19ab3761315a90975"} Oct 03 09:00:38 crc kubenswrapper[4765]: I1003 09:00:38.492858 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"94554d2b-e45b-4c38-8871-e3b4febc81c9","Type":"ContainerDied","Data":"470a9cbaae5d3d072025eb3f50a1b783b9601912597f3957f8e110e9c1176a2f"} Oct 03 09:00:39 crc kubenswrapper[4765]: I1003 09:00:39.989847 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c"] Oct 03 09:00:39 crc kubenswrapper[4765]: I1003 09:00:39.998636 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-4ll9c"] Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.058915 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher3897-account-delete-x4lz4"] Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.061720 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher3897-account-delete-x4lz4" Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.091178 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.091406 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="07aae327-dc35-4cbd-9566-027d2a972f57" containerName="watcher-applier" containerID="cri-o://fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35" gracePeriod=30 Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.105294 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher3897-account-delete-x4lz4"] Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.114204 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.114462 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="449e82dc-3ec9-40a6-8067-a94e3ccd1be5" containerName="watcher-decision-engine" containerID="cri-o://267ad800f7352bd13ee8919738203ef8405405ee534397788ac16556b41cd6d5" gracePeriod=30 Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.133449 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-skwjk"] Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.149044 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-skwjk"] Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.178730 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.178988 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerName="watcher-kuttl-api-log" containerID="cri-o://d99bc88f1622d082610e0c7f328b2d0e3ebf9d36de3a796f7ef7cf8bcbcab157" gracePeriod=30 Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.179572 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerName="watcher-api" containerID="cri-o://b1be1b329830cd59739528151fc1bd5e2c1f802d74604af5fd6b2579a1031fbd" gracePeriod=30 Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.198844 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-3897-account-create-qd2xq"] Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.212553 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-3897-account-create-qd2xq"] Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.217522 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9mp7\" (UniqueName: \"kubernetes.io/projected/653475b8-1f2d-4e5c-8710-4fa4d7bc838f-kube-api-access-b9mp7\") pod \"watcher3897-account-delete-x4lz4\" (UID: \"653475b8-1f2d-4e5c-8710-4fa4d7bc838f\") " pod="watcher-kuttl-default/watcher3897-account-delete-x4lz4" Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.224782 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["watcher-kuttl-default/watcher3897-account-delete-x4lz4"] Oct 03 09:00:40 crc kubenswrapper[4765]: E1003 09:00:40.226057 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-b9mp7], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/watcher3897-account-delete-x4lz4" podUID="653475b8-1f2d-4e5c-8710-4fa4d7bc838f" Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.319631 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9mp7\" (UniqueName: \"kubernetes.io/projected/653475b8-1f2d-4e5c-8710-4fa4d7bc838f-kube-api-access-b9mp7\") pod \"watcher3897-account-delete-x4lz4\" (UID: \"653475b8-1f2d-4e5c-8710-4fa4d7bc838f\") " pod="watcher-kuttl-default/watcher3897-account-delete-x4lz4" Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.328692 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="498f14d0-1965-4754-8cb4-5742535aa52b" path="/var/lib/kubelet/pods/498f14d0-1965-4754-8cb4-5742535aa52b/volumes" Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.329717 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c523deb-e99b-4a55-85f7-35346117544f" path="/var/lib/kubelet/pods/6c523deb-e99b-4a55-85f7-35346117544f/volumes" Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.342420 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de1b6b9b-b149-41f6-8657-d3e2cfac1e32" path="/var/lib/kubelet/pods/de1b6b9b-b149-41f6-8657-d3e2cfac1e32/volumes" Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.356456 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9mp7\" (UniqueName: \"kubernetes.io/projected/653475b8-1f2d-4e5c-8710-4fa4d7bc838f-kube-api-access-b9mp7\") pod \"watcher3897-account-delete-x4lz4\" (UID: \"653475b8-1f2d-4e5c-8710-4fa4d7bc838f\") " pod="watcher-kuttl-default/watcher3897-account-delete-x4lz4" Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.510417 4765 generic.go:334] "Generic (PLEG): container finished" podID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerID="d99bc88f1622d082610e0c7f328b2d0e3ebf9d36de3a796f7ef7cf8bcbcab157" exitCode=143 Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.510500 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3897-account-delete-x4lz4" Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.511074 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"3961e3c3-cfbc-4b31-af2d-957d38432858","Type":"ContainerDied","Data":"d99bc88f1622d082610e0c7f328b2d0e3ebf9d36de3a796f7ef7cf8bcbcab157"} Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.521970 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher3897-account-delete-x4lz4" Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.625229 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9mp7\" (UniqueName: \"kubernetes.io/projected/653475b8-1f2d-4e5c-8710-4fa4d7bc838f-kube-api-access-b9mp7\") pod \"653475b8-1f2d-4e5c-8710-4fa4d7bc838f\" (UID: \"653475b8-1f2d-4e5c-8710-4fa4d7bc838f\") " Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.630188 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/653475b8-1f2d-4e5c-8710-4fa4d7bc838f-kube-api-access-b9mp7" (OuterVolumeSpecName: "kube-api-access-b9mp7") pod "653475b8-1f2d-4e5c-8710-4fa4d7bc838f" (UID: "653475b8-1f2d-4e5c-8710-4fa4d7bc838f"). InnerVolumeSpecName "kube-api-access-b9mp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:40 crc kubenswrapper[4765]: I1003 09:00:40.726986 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9mp7\" (UniqueName: \"kubernetes.io/projected/653475b8-1f2d-4e5c-8710-4fa4d7bc838f-kube-api-access-b9mp7\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.037405 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.135:9322/\": read tcp 10.217.0.2:37846->10.217.0.135:9322: read: connection reset by peer" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.037459 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.135:9322/\": read tcp 10.217.0.2:37848->10.217.0.135:9322: read: connection reset by peer" Oct 03 09:00:41 crc kubenswrapper[4765]: E1003 09:00:41.060302 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:00:41 crc kubenswrapper[4765]: E1003 09:00:41.061803 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:00:41 crc kubenswrapper[4765]: E1003 09:00:41.062997 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:00:41 crc kubenswrapper[4765]: E1003 09:00:41.063034 4765 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="07aae327-dc35-4cbd-9566-027d2a972f57" containerName="watcher-applier" Oct 03 
09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.522744 4765 generic.go:334] "Generic (PLEG): container finished" podID="449e82dc-3ec9-40a6-8067-a94e3ccd1be5" containerID="267ad800f7352bd13ee8919738203ef8405405ee534397788ac16556b41cd6d5" exitCode=0 Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.523070 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"449e82dc-3ec9-40a6-8067-a94e3ccd1be5","Type":"ContainerDied","Data":"267ad800f7352bd13ee8919738203ef8405405ee534397788ac16556b41cd6d5"} Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.524695 4765 generic.go:334] "Generic (PLEG): container finished" podID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerID="b1be1b329830cd59739528151fc1bd5e2c1f802d74604af5fd6b2579a1031fbd" exitCode=0 Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.524772 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3897-account-delete-x4lz4" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.529092 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"3961e3c3-cfbc-4b31-af2d-957d38432858","Type":"ContainerDied","Data":"b1be1b329830cd59739528151fc1bd5e2c1f802d74604af5fd6b2579a1031fbd"} Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.529138 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"3961e3c3-cfbc-4b31-af2d-957d38432858","Type":"ContainerDied","Data":"7c36260de7c1b2e82a18880c567df91d5d3e03f614a6f3638612d7789fe025df"} Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.529149 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c36260de7c1b2e82a18880c567df91d5d3e03f614a6f3638612d7789fe025df" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.597183 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.617524 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.631611 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher3897-account-delete-x4lz4"] Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.638350 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher3897-account-delete-x4lz4"] Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.746586 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrr9t\" (UniqueName: \"kubernetes.io/projected/3961e3c3-cfbc-4b31-af2d-957d38432858-kube-api-access-rrr9t\") pod \"3961e3c3-cfbc-4b31-af2d-957d38432858\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.746779 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-787q5\" (UniqueName: \"kubernetes.io/projected/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-kube-api-access-787q5\") pod \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.746817 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-combined-ca-bundle\") pod \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.746853 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-config-data\") pod \"3961e3c3-cfbc-4b31-af2d-957d38432858\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.746938 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-custom-prometheus-ca\") pod \"3961e3c3-cfbc-4b31-af2d-957d38432858\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.746975 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-combined-ca-bundle\") pod \"3961e3c3-cfbc-4b31-af2d-957d38432858\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.747013 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-logs\") pod \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.747116 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3961e3c3-cfbc-4b31-af2d-957d38432858-logs\") pod \"3961e3c3-cfbc-4b31-af2d-957d38432858\" (UID: \"3961e3c3-cfbc-4b31-af2d-957d38432858\") " Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.747150 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-custom-prometheus-ca\") pod 
\"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.747206 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-config-data\") pod \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\" (UID: \"449e82dc-3ec9-40a6-8067-a94e3ccd1be5\") " Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.749302 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3961e3c3-cfbc-4b31-af2d-957d38432858-logs" (OuterVolumeSpecName: "logs") pod "3961e3c3-cfbc-4b31-af2d-957d38432858" (UID: "3961e3c3-cfbc-4b31-af2d-957d38432858"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.749570 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-logs" (OuterVolumeSpecName: "logs") pod "449e82dc-3ec9-40a6-8067-a94e3ccd1be5" (UID: "449e82dc-3ec9-40a6-8067-a94e3ccd1be5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.771064 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3961e3c3-cfbc-4b31-af2d-957d38432858-kube-api-access-rrr9t" (OuterVolumeSpecName: "kube-api-access-rrr9t") pod "3961e3c3-cfbc-4b31-af2d-957d38432858" (UID: "3961e3c3-cfbc-4b31-af2d-957d38432858"). InnerVolumeSpecName "kube-api-access-rrr9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.775850 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-kube-api-access-787q5" (OuterVolumeSpecName: "kube-api-access-787q5") pod "449e82dc-3ec9-40a6-8067-a94e3ccd1be5" (UID: "449e82dc-3ec9-40a6-8067-a94e3ccd1be5"). InnerVolumeSpecName "kube-api-access-787q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.784543 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "449e82dc-3ec9-40a6-8067-a94e3ccd1be5" (UID: "449e82dc-3ec9-40a6-8067-a94e3ccd1be5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.785936 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "449e82dc-3ec9-40a6-8067-a94e3ccd1be5" (UID: "449e82dc-3ec9-40a6-8067-a94e3ccd1be5"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.790183 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3961e3c3-cfbc-4b31-af2d-957d38432858" (UID: "3961e3c3-cfbc-4b31-af2d-957d38432858"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.796252 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3961e3c3-cfbc-4b31-af2d-957d38432858" (UID: "3961e3c3-cfbc-4b31-af2d-957d38432858"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.807233 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-config-data" (OuterVolumeSpecName: "config-data") pod "449e82dc-3ec9-40a6-8067-a94e3ccd1be5" (UID: "449e82dc-3ec9-40a6-8067-a94e3ccd1be5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.811862 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-config-data" (OuterVolumeSpecName: "config-data") pod "3961e3c3-cfbc-4b31-af2d-957d38432858" (UID: "3961e3c3-cfbc-4b31-af2d-957d38432858"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.849542 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.849579 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.849590 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.849600 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3961e3c3-cfbc-4b31-af2d-957d38432858-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.849608 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.849617 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.849626 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrr9t\" (UniqueName: \"kubernetes.io/projected/3961e3c3-cfbc-4b31-af2d-957d38432858-kube-api-access-rrr9t\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.849634 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-787q5\" (UniqueName: \"kubernetes.io/projected/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-kube-api-access-787q5\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 
09:00:41.849665 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/449e82dc-3ec9-40a6-8067-a94e3ccd1be5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:41 crc kubenswrapper[4765]: I1003 09:00:41.849679 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3961e3c3-cfbc-4b31-af2d-957d38432858-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:42 crc kubenswrapper[4765]: I1003 09:00:42.318037 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="653475b8-1f2d-4e5c-8710-4fa4d7bc838f" path="/var/lib/kubelet/pods/653475b8-1f2d-4e5c-8710-4fa4d7bc838f/volumes" Oct 03 09:00:42 crc kubenswrapper[4765]: I1003 09:00:42.533332 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:00:42 crc kubenswrapper[4765]: I1003 09:00:42.533320 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"449e82dc-3ec9-40a6-8067-a94e3ccd1be5","Type":"ContainerDied","Data":"48370813e7c9f11310ec52d3d6264053cd2396e45d2edb81f74bacfb620fbea5"} Oct 03 09:00:42 crc kubenswrapper[4765]: I1003 09:00:42.533451 4765 scope.go:117] "RemoveContainer" containerID="267ad800f7352bd13ee8919738203ef8405405ee534397788ac16556b41cd6d5" Oct 03 09:00:42 crc kubenswrapper[4765]: I1003 09:00:42.533341 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:00:42 crc kubenswrapper[4765]: I1003 09:00:42.567942 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:00:42 crc kubenswrapper[4765]: I1003 09:00:42.580261 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:00:42 crc kubenswrapper[4765]: I1003 09:00:42.588028 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:00:42 crc kubenswrapper[4765]: I1003 09:00:42.594941 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:00:44 crc kubenswrapper[4765]: I1003 09:00:44.315581 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3961e3c3-cfbc-4b31-af2d-957d38432858" path="/var/lib/kubelet/pods/3961e3c3-cfbc-4b31-af2d-957d38432858/volumes" Oct 03 09:00:44 crc kubenswrapper[4765]: I1003 09:00:44.316872 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="449e82dc-3ec9-40a6-8067-a94e3ccd1be5" path="/var/lib/kubelet/pods/449e82dc-3ec9-40a6-8067-a94e3ccd1be5/volumes" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.105080 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.215878 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-config-data\") pod \"07aae327-dc35-4cbd-9566-027d2a972f57\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.216229 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrkkx\" (UniqueName: \"kubernetes.io/projected/07aae327-dc35-4cbd-9566-027d2a972f57-kube-api-access-lrkkx\") pod \"07aae327-dc35-4cbd-9566-027d2a972f57\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.216365 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-combined-ca-bundle\") pod \"07aae327-dc35-4cbd-9566-027d2a972f57\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.216384 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07aae327-dc35-4cbd-9566-027d2a972f57-logs\") pod \"07aae327-dc35-4cbd-9566-027d2a972f57\" (UID: \"07aae327-dc35-4cbd-9566-027d2a972f57\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.216842 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07aae327-dc35-4cbd-9566-027d2a972f57-logs" (OuterVolumeSpecName: "logs") pod "07aae327-dc35-4cbd-9566-027d2a972f57" (UID: "07aae327-dc35-4cbd-9566-027d2a972f57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.226931 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07aae327-dc35-4cbd-9566-027d2a972f57-kube-api-access-lrkkx" (OuterVolumeSpecName: "kube-api-access-lrkkx") pod "07aae327-dc35-4cbd-9566-027d2a972f57" (UID: "07aae327-dc35-4cbd-9566-027d2a972f57"). InnerVolumeSpecName "kube-api-access-lrkkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.243263 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07aae327-dc35-4cbd-9566-027d2a972f57" (UID: "07aae327-dc35-4cbd-9566-027d2a972f57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.273025 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-config-data" (OuterVolumeSpecName: "config-data") pod "07aae327-dc35-4cbd-9566-027d2a972f57" (UID: "07aae327-dc35-4cbd-9566-027d2a972f57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.318879 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.318913 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07aae327-dc35-4cbd-9566-027d2a972f57-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.318926 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07aae327-dc35-4cbd-9566-027d2a972f57-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.318937 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrkkx\" (UniqueName: \"kubernetes.io/projected/07aae327-dc35-4cbd-9566-027d2a972f57-kube-api-access-lrkkx\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.564492 4765 generic.go:334] "Generic (PLEG): container finished" podID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerID="7e5174f0fd52405eb17accd4f6b83457a5ef9e014ed5831cd1e8e6991515d187" exitCode=0 Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.564555 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"94554d2b-e45b-4c38-8871-e3b4febc81c9","Type":"ContainerDied","Data":"7e5174f0fd52405eb17accd4f6b83457a5ef9e014ed5831cd1e8e6991515d187"} Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.566292 4765 generic.go:334] "Generic (PLEG): container finished" podID="07aae327-dc35-4cbd-9566-027d2a972f57" containerID="fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35" exitCode=0 Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.566325 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"07aae327-dc35-4cbd-9566-027d2a972f57","Type":"ContainerDied","Data":"fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35"} Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.566349 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"07aae327-dc35-4cbd-9566-027d2a972f57","Type":"ContainerDied","Data":"70e982299d14deda4fce787103a9b1e120db1d6c4128ba597670dbaaeba6e3d5"} Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.566370 4765 scope.go:117] "RemoveContainer" containerID="fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.566503 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.589533 4765 scope.go:117] "RemoveContainer" containerID="fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35" Oct 03 09:00:45 crc kubenswrapper[4765]: E1003 09:00:45.590197 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35\": container with ID starting with fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35 not found: ID does not exist" containerID="fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.590299 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35"} err="failed to get container status \"fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35\": rpc error: code = NotFound desc = could not find container \"fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35\": container with ID starting with fab7606703c6f973346c547f76bd13a3dc1446fa542728941391fdc5fc1d3c35 not found: ID does not exist" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.594520 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.605916 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.635180 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.825665 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-scripts\") pod \"94554d2b-e45b-4c38-8871-e3b4febc81c9\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.825737 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dwb4\" (UniqueName: \"kubernetes.io/projected/94554d2b-e45b-4c38-8871-e3b4febc81c9-kube-api-access-6dwb4\") pod \"94554d2b-e45b-4c38-8871-e3b4febc81c9\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.825771 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-sg-core-conf-yaml\") pod \"94554d2b-e45b-4c38-8871-e3b4febc81c9\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.825832 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-combined-ca-bundle\") pod \"94554d2b-e45b-4c38-8871-e3b4febc81c9\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.825897 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-ceilometer-tls-certs\") pod 
\"94554d2b-e45b-4c38-8871-e3b4febc81c9\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.825922 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-config-data\") pod \"94554d2b-e45b-4c38-8871-e3b4febc81c9\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.825992 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-run-httpd\") pod \"94554d2b-e45b-4c38-8871-e3b4febc81c9\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.826031 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-log-httpd\") pod \"94554d2b-e45b-4c38-8871-e3b4febc81c9\" (UID: \"94554d2b-e45b-4c38-8871-e3b4febc81c9\") " Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.826783 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "94554d2b-e45b-4c38-8871-e3b4febc81c9" (UID: "94554d2b-e45b-4c38-8871-e3b4febc81c9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.827009 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.827198 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "94554d2b-e45b-4c38-8871-e3b4febc81c9" (UID: "94554d2b-e45b-4c38-8871-e3b4febc81c9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.830447 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-scripts" (OuterVolumeSpecName: "scripts") pod "94554d2b-e45b-4c38-8871-e3b4febc81c9" (UID: "94554d2b-e45b-4c38-8871-e3b4febc81c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.842098 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94554d2b-e45b-4c38-8871-e3b4febc81c9-kube-api-access-6dwb4" (OuterVolumeSpecName: "kube-api-access-6dwb4") pod "94554d2b-e45b-4c38-8871-e3b4febc81c9" (UID: "94554d2b-e45b-4c38-8871-e3b4febc81c9"). InnerVolumeSpecName "kube-api-access-6dwb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.852416 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "94554d2b-e45b-4c38-8871-e3b4febc81c9" (UID: "94554d2b-e45b-4c38-8871-e3b4febc81c9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.882943 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "94554d2b-e45b-4c38-8871-e3b4febc81c9" (UID: "94554d2b-e45b-4c38-8871-e3b4febc81c9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.897523 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94554d2b-e45b-4c38-8871-e3b4febc81c9" (UID: "94554d2b-e45b-4c38-8871-e3b4febc81c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.929284 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94554d2b-e45b-4c38-8871-e3b4febc81c9-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.929585 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.929608 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dwb4\" (UniqueName: \"kubernetes.io/projected/94554d2b-e45b-4c38-8871-e3b4febc81c9-kube-api-access-6dwb4\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.929619 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.929628 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.929636 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:45 crc kubenswrapper[4765]: I1003 09:00:45.935366 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-config-data" (OuterVolumeSpecName: "config-data") pod "94554d2b-e45b-4c38-8871-e3b4febc81c9" (UID: "94554d2b-e45b-4c38-8871-e3b4febc81c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.031110 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94554d2b-e45b-4c38-8871-e3b4febc81c9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.234282 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-dvh78"] Oct 03 09:00:46 crc kubenswrapper[4765]: E1003 09:00:46.234682 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="449e82dc-3ec9-40a6-8067-a94e3ccd1be5" containerName="watcher-decision-engine" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.234705 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="449e82dc-3ec9-40a6-8067-a94e3ccd1be5" containerName="watcher-decision-engine" Oct 03 09:00:46 crc kubenswrapper[4765]: E1003 09:00:46.234726 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="ceilometer-central-agent" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.234736 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="ceilometer-central-agent" Oct 03 09:00:46 crc kubenswrapper[4765]: E1003 09:00:46.234762 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerName="watcher-kuttl-api-log" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.234770 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerName="watcher-kuttl-api-log" Oct 03 09:00:46 crc kubenswrapper[4765]: E1003 09:00:46.234784 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07aae327-dc35-4cbd-9566-027d2a972f57" containerName="watcher-applier" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.234791 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="07aae327-dc35-4cbd-9566-027d2a972f57" containerName="watcher-applier" Oct 03 09:00:46 crc kubenswrapper[4765]: E1003 09:00:46.234800 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerName="watcher-api" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.234810 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerName="watcher-api" Oct 03 09:00:46 crc kubenswrapper[4765]: E1003 09:00:46.234829 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="proxy-httpd" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.234837 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="proxy-httpd" Oct 03 09:00:46 crc kubenswrapper[4765]: E1003 09:00:46.234854 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="ceilometer-notification-agent" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.234862 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="ceilometer-notification-agent" Oct 03 09:00:46 crc kubenswrapper[4765]: E1003 09:00:46.234873 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="sg-core" Oct 03 09:00:46 crc kubenswrapper[4765]: 
I1003 09:00:46.234880 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="sg-core" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.235091 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="ceilometer-notification-agent" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.235127 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerName="watcher-kuttl-api-log" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.235150 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="ceilometer-central-agent" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.235163 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="449e82dc-3ec9-40a6-8067-a94e3ccd1be5" containerName="watcher-decision-engine" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.235172 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="sg-core" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.235189 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" containerName="proxy-httpd" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.235217 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="07aae327-dc35-4cbd-9566-027d2a972f57" containerName="watcher-applier" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.235237 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3961e3c3-cfbc-4b31-af2d-957d38432858" containerName="watcher-api" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.235929 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-dvh78" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.247158 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-dvh78"] Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.319474 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07aae327-dc35-4cbd-9566-027d2a972f57" path="/var/lib/kubelet/pods/07aae327-dc35-4cbd-9566-027d2a972f57/volumes" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.333922 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47ds4\" (UniqueName: \"kubernetes.io/projected/d833609e-467c-49a2-ac00-c5224ffb1cb5-kube-api-access-47ds4\") pod \"watcher-db-create-dvh78\" (UID: \"d833609e-467c-49a2-ac00-c5224ffb1cb5\") " pod="watcher-kuttl-default/watcher-db-create-dvh78" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.435158 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47ds4\" (UniqueName: \"kubernetes.io/projected/d833609e-467c-49a2-ac00-c5224ffb1cb5-kube-api-access-47ds4\") pod \"watcher-db-create-dvh78\" (UID: \"d833609e-467c-49a2-ac00-c5224ffb1cb5\") " pod="watcher-kuttl-default/watcher-db-create-dvh78" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.452002 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47ds4\" (UniqueName: \"kubernetes.io/projected/d833609e-467c-49a2-ac00-c5224ffb1cb5-kube-api-access-47ds4\") pod \"watcher-db-create-dvh78\" (UID: \"d833609e-467c-49a2-ac00-c5224ffb1cb5\") " pod="watcher-kuttl-default/watcher-db-create-dvh78" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.553737 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-dvh78" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.579717 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"94554d2b-e45b-4c38-8871-e3b4febc81c9","Type":"ContainerDied","Data":"57149f8ee7f3aab948f11fdde03b72b6885a6e1ef3e320cd48a9231407f188ac"} Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.579772 4765 scope.go:117] "RemoveContainer" containerID="648e88427fda1612f35b3aa26beb8885f330cf2512ef2c142fc9274949e3ca4f" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.579813 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.611710 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.617300 4765 scope.go:117] "RemoveContainer" containerID="453023e03ab393ece167179e27b70c18a91576ec723a5ab19ab3761315a90975" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.624846 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.655378 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.657922 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.672893 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.673316 4765 scope.go:117] "RemoveContainer" containerID="7e5174f0fd52405eb17accd4f6b83457a5ef9e014ed5831cd1e8e6991515d187" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.673820 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.674312 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.717428 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.720738 4765 scope.go:117] "RemoveContainer" containerID="470a9cbaae5d3d072025eb3f50a1b783b9601912597f3957f8e110e9c1176a2f" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.840426 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-scripts\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.840738 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.840771 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.840857 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-config-data\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.840882 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-run-httpd\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.840915 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.840944 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84k42\" (UniqueName: \"kubernetes.io/projected/e4e9013d-4e82-42e9-81f4-f3a8608d564d-kube-api-access-84k42\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.840975 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-log-httpd\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.942155 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-log-httpd\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.942220 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-scripts\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.942293 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.942322 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.942390 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-config-data\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.942413 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-run-httpd\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.942442 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.942475 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84k42\" (UniqueName: \"kubernetes.io/projected/e4e9013d-4e82-42e9-81f4-f3a8608d564d-kube-api-access-84k42\") pod \"ceilometer-0\" (UID: 
\"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.943238 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-log-httpd\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.944128 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-run-httpd\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.950223 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.950247 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.977258 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-config-data\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.979290 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-scripts\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.979939 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.982187 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84k42\" (UniqueName: \"kubernetes.io/projected/e4e9013d-4e82-42e9-81f4-f3a8608d564d-kube-api-access-84k42\") pod \"ceilometer-0\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:46 crc kubenswrapper[4765]: I1003 09:00:46.998604 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:47 crc kubenswrapper[4765]: I1003 09:00:47.291415 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-dvh78"] Oct 03 09:00:47 crc kubenswrapper[4765]: I1003 09:00:47.493898 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:00:47 crc kubenswrapper[4765]: I1003 09:00:47.588244 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e4e9013d-4e82-42e9-81f4-f3a8608d564d","Type":"ContainerStarted","Data":"3615654e2130e877de03b052069a7174cab00b6321bc4c569e070af7219260c6"} Oct 03 09:00:47 crc kubenswrapper[4765]: I1003 09:00:47.591350 4765 generic.go:334] "Generic (PLEG): container finished" podID="d833609e-467c-49a2-ac00-c5224ffb1cb5" containerID="4950246c5c7eca0d45ad61a373617e1f39830c8dfe1935688f16f0279279a881" exitCode=0 Oct 03 09:00:47 crc kubenswrapper[4765]: I1003 09:00:47.591393 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-dvh78" event={"ID":"d833609e-467c-49a2-ac00-c5224ffb1cb5","Type":"ContainerDied","Data":"4950246c5c7eca0d45ad61a373617e1f39830c8dfe1935688f16f0279279a881"} Oct 03 09:00:47 crc kubenswrapper[4765]: I1003 09:00:47.591417 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-dvh78" event={"ID":"d833609e-467c-49a2-ac00-c5224ffb1cb5","Type":"ContainerStarted","Data":"228894ff52ba99f51f02f288e8d0d1fb39168df240762973610ed4f312f28431"} Oct 03 09:00:48 crc kubenswrapper[4765]: I1003 09:00:48.315959 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94554d2b-e45b-4c38-8871-e3b4febc81c9" path="/var/lib/kubelet/pods/94554d2b-e45b-4c38-8871-e3b4febc81c9/volumes" Oct 03 09:00:48 crc kubenswrapper[4765]: I1003 09:00:48.600808 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e4e9013d-4e82-42e9-81f4-f3a8608d564d","Type":"ContainerStarted","Data":"e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0"} Oct 03 09:00:49 crc kubenswrapper[4765]: I1003 09:00:49.006331 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-dvh78" Oct 03 09:00:49 crc kubenswrapper[4765]: I1003 09:00:49.085133 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47ds4\" (UniqueName: \"kubernetes.io/projected/d833609e-467c-49a2-ac00-c5224ffb1cb5-kube-api-access-47ds4\") pod \"d833609e-467c-49a2-ac00-c5224ffb1cb5\" (UID: \"d833609e-467c-49a2-ac00-c5224ffb1cb5\") " Oct 03 09:00:49 crc kubenswrapper[4765]: I1003 09:00:49.089989 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d833609e-467c-49a2-ac00-c5224ffb1cb5-kube-api-access-47ds4" (OuterVolumeSpecName: "kube-api-access-47ds4") pod "d833609e-467c-49a2-ac00-c5224ffb1cb5" (UID: "d833609e-467c-49a2-ac00-c5224ffb1cb5"). InnerVolumeSpecName "kube-api-access-47ds4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:49 crc kubenswrapper[4765]: I1003 09:00:49.187345 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47ds4\" (UniqueName: \"kubernetes.io/projected/d833609e-467c-49a2-ac00-c5224ffb1cb5-kube-api-access-47ds4\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:49 crc kubenswrapper[4765]: I1003 09:00:49.610231 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e4e9013d-4e82-42e9-81f4-f3a8608d564d","Type":"ContainerStarted","Data":"fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8"} Oct 03 09:00:49 crc kubenswrapper[4765]: I1003 09:00:49.610278 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e4e9013d-4e82-42e9-81f4-f3a8608d564d","Type":"ContainerStarted","Data":"85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0"} Oct 03 09:00:49 crc kubenswrapper[4765]: I1003 09:00:49.616147 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-dvh78" event={"ID":"d833609e-467c-49a2-ac00-c5224ffb1cb5","Type":"ContainerDied","Data":"228894ff52ba99f51f02f288e8d0d1fb39168df240762973610ed4f312f28431"} Oct 03 09:00:49 crc kubenswrapper[4765]: I1003 09:00:49.616184 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="228894ff52ba99f51f02f288e8d0d1fb39168df240762973610ed4f312f28431" Oct 03 09:00:49 crc kubenswrapper[4765]: I1003 09:00:49.616235 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-dvh78" Oct 03 09:00:51 crc kubenswrapper[4765]: I1003 09:00:51.634817 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e4e9013d-4e82-42e9-81f4-f3a8608d564d","Type":"ContainerStarted","Data":"3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67"} Oct 03 09:00:51 crc kubenswrapper[4765]: I1003 09:00:51.635402 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:00:51 crc kubenswrapper[4765]: I1003 09:00:51.660246 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.682980861 podStartE2EDuration="5.660229151s" podCreationTimestamp="2025-10-03 09:00:46 +0000 UTC" firstStartedPulling="2025-10-03 09:00:47.497375609 +0000 UTC m=+1291.798869939" lastFinishedPulling="2025-10-03 09:00:50.474623889 +0000 UTC m=+1294.776118229" observedRunningTime="2025-10-03 09:00:51.652465046 +0000 UTC m=+1295.953959386" watchObservedRunningTime="2025-10-03 09:00:51.660229151 +0000 UTC m=+1295.961723481" Oct 03 09:00:56 crc kubenswrapper[4765]: I1003 09:00:56.244905 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-3038-account-create-78hd4"] Oct 03 09:00:56 crc kubenswrapper[4765]: E1003 09:00:56.245576 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d833609e-467c-49a2-ac00-c5224ffb1cb5" containerName="mariadb-database-create" Oct 03 09:00:56 crc kubenswrapper[4765]: I1003 09:00:56.245591 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d833609e-467c-49a2-ac00-c5224ffb1cb5" containerName="mariadb-database-create" Oct 03 09:00:56 crc kubenswrapper[4765]: I1003 09:00:56.245748 4765 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d833609e-467c-49a2-ac00-c5224ffb1cb5" containerName="mariadb-database-create" Oct 03 09:00:56 crc kubenswrapper[4765]: I1003 09:00:56.246275 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3038-account-create-78hd4" Oct 03 09:00:56 crc kubenswrapper[4765]: I1003 09:00:56.248869 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Oct 03 09:00:56 crc kubenswrapper[4765]: I1003 09:00:56.264364 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3038-account-create-78hd4"] Oct 03 09:00:56 crc kubenswrapper[4765]: I1003 09:00:56.392954 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j22k9\" (UniqueName: \"kubernetes.io/projected/693c5bb3-d4e8-4f16-863e-c79e9d1de450-kube-api-access-j22k9\") pod \"watcher-3038-account-create-78hd4\" (UID: \"693c5bb3-d4e8-4f16-863e-c79e9d1de450\") " pod="watcher-kuttl-default/watcher-3038-account-create-78hd4" Oct 03 09:00:56 crc kubenswrapper[4765]: I1003 09:00:56.495217 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j22k9\" (UniqueName: \"kubernetes.io/projected/693c5bb3-d4e8-4f16-863e-c79e9d1de450-kube-api-access-j22k9\") pod \"watcher-3038-account-create-78hd4\" (UID: \"693c5bb3-d4e8-4f16-863e-c79e9d1de450\") " pod="watcher-kuttl-default/watcher-3038-account-create-78hd4" Oct 03 09:00:56 crc kubenswrapper[4765]: I1003 09:00:56.514248 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j22k9\" (UniqueName: \"kubernetes.io/projected/693c5bb3-d4e8-4f16-863e-c79e9d1de450-kube-api-access-j22k9\") pod \"watcher-3038-account-create-78hd4\" (UID: \"693c5bb3-d4e8-4f16-863e-c79e9d1de450\") " pod="watcher-kuttl-default/watcher-3038-account-create-78hd4" Oct 03 09:00:56 crc kubenswrapper[4765]: I1003 09:00:56.580798 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3038-account-create-78hd4" Oct 03 09:00:57 crc kubenswrapper[4765]: I1003 09:00:57.073502 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-3038-account-create-78hd4"] Oct 03 09:00:57 crc kubenswrapper[4765]: I1003 09:00:57.683020 4765 generic.go:334] "Generic (PLEG): container finished" podID="693c5bb3-d4e8-4f16-863e-c79e9d1de450" containerID="3f6a182133151f1955d7525508cf4c5906a5b3a7d463d258e266b946537feca2" exitCode=0 Oct 03 09:00:57 crc kubenswrapper[4765]: I1003 09:00:57.683073 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3038-account-create-78hd4" event={"ID":"693c5bb3-d4e8-4f16-863e-c79e9d1de450","Type":"ContainerDied","Data":"3f6a182133151f1955d7525508cf4c5906a5b3a7d463d258e266b946537feca2"} Oct 03 09:00:57 crc kubenswrapper[4765]: I1003 09:00:57.684409 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3038-account-create-78hd4" event={"ID":"693c5bb3-d4e8-4f16-863e-c79e9d1de450","Type":"ContainerStarted","Data":"9997bf9bc9d5c6b4ed0c16df9984422f3578c5e8a5305f4c974f03f7c406bc44"} Oct 03 09:00:59 crc kubenswrapper[4765]: I1003 09:00:59.077180 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-3038-account-create-78hd4" Oct 03 09:00:59 crc kubenswrapper[4765]: I1003 09:00:59.151611 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j22k9\" (UniqueName: \"kubernetes.io/projected/693c5bb3-d4e8-4f16-863e-c79e9d1de450-kube-api-access-j22k9\") pod \"693c5bb3-d4e8-4f16-863e-c79e9d1de450\" (UID: \"693c5bb3-d4e8-4f16-863e-c79e9d1de450\") " Oct 03 09:00:59 crc kubenswrapper[4765]: I1003 09:00:59.157635 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693c5bb3-d4e8-4f16-863e-c79e9d1de450-kube-api-access-j22k9" (OuterVolumeSpecName: "kube-api-access-j22k9") pod "693c5bb3-d4e8-4f16-863e-c79e9d1de450" (UID: "693c5bb3-d4e8-4f16-863e-c79e9d1de450"). InnerVolumeSpecName "kube-api-access-j22k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:59 crc kubenswrapper[4765]: I1003 09:00:59.254034 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j22k9\" (UniqueName: \"kubernetes.io/projected/693c5bb3-d4e8-4f16-863e-c79e9d1de450-kube-api-access-j22k9\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:59 crc kubenswrapper[4765]: I1003 09:00:59.705762 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-3038-account-create-78hd4" event={"ID":"693c5bb3-d4e8-4f16-863e-c79e9d1de450","Type":"ContainerDied","Data":"9997bf9bc9d5c6b4ed0c16df9984422f3578c5e8a5305f4c974f03f7c406bc44"} Oct 03 09:00:59 crc kubenswrapper[4765]: I1003 09:00:59.706075 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9997bf9bc9d5c6b4ed0c16df9984422f3578c5e8a5305f4c974f03f7c406bc44" Oct 03 09:00:59 crc kubenswrapper[4765]: I1003 09:00:59.705836 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-3038-account-create-78hd4" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.146046 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-cron-29324701-zwcd8"] Oct 03 09:01:00 crc kubenswrapper[4765]: E1003 09:01:00.146457 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693c5bb3-d4e8-4f16-863e-c79e9d1de450" containerName="mariadb-account-create" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.146471 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="693c5bb3-d4e8-4f16-863e-c79e9d1de450" containerName="mariadb-account-create" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.146727 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="693c5bb3-d4e8-4f16-863e-c79e9d1de450" containerName="mariadb-account-create" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.147451 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.156868 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-cron-29324701-zwcd8"] Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.176876 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkb7c\" (UniqueName: \"kubernetes.io/projected/ad34be00-f9d6-41d7-9db0-decf8a030e53-kube-api-access-gkb7c\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.176972 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-fernet-keys\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.177011 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-config-data\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.177037 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-combined-ca-bundle\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.278186 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-combined-ca-bundle\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.278309 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkb7c\" (UniqueName: \"kubernetes.io/projected/ad34be00-f9d6-41d7-9db0-decf8a030e53-kube-api-access-gkb7c\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.278402 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-fernet-keys\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.278446 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-config-data\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 
09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.282735 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-config-data\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.284737 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-fernet-keys\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.286562 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-combined-ca-bundle\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.298603 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkb7c\" (UniqueName: \"kubernetes.io/projected/ad34be00-f9d6-41d7-9db0-decf8a030e53-kube-api-access-gkb7c\") pod \"keystone-cron-29324701-zwcd8\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.468742 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.680828 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.681156 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:01:00 crc kubenswrapper[4765]: I1003 09:01:00.913328 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-cron-29324701-zwcd8"] Oct 03 09:01:00 crc kubenswrapper[4765]: W1003 09:01:00.917830 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad34be00_f9d6_41d7_9db0_decf8a030e53.slice/crio-e6138f3f76299f0e5f198fbd350880c00cd0f1966bb93ddc3b7235902c8586c8 WatchSource:0}: Error finding container e6138f3f76299f0e5f198fbd350880c00cd0f1966bb93ddc3b7235902c8586c8: Status 404 returned error can't find the container with id e6138f3f76299f0e5f198fbd350880c00cd0f1966bb93ddc3b7235902c8586c8 Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.407229 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-krl79"] Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.408875 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.410938 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-v6c2m" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.411529 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.430472 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-krl79"] Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.499148 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-db-sync-config-data\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.499372 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-config-data\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.499426 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.499617 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7dqt\" (UniqueName: \"kubernetes.io/projected/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-kube-api-access-q7dqt\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.601232 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7dqt\" (UniqueName: \"kubernetes.io/projected/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-kube-api-access-q7dqt\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.601293 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-db-sync-config-data\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.601347 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-config-data\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc 
kubenswrapper[4765]: I1003 09:01:01.601370 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.604987 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-db-sync-config-data\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.614278 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.614462 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-config-data\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.624321 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7dqt\" (UniqueName: \"kubernetes.io/projected/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-kube-api-access-q7dqt\") pod \"watcher-kuttl-db-sync-krl79\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.721470 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" event={"ID":"ad34be00-f9d6-41d7-9db0-decf8a030e53","Type":"ContainerStarted","Data":"572831d92fe9d68e1aad57cd3b86f654e656cf15a327624a653b0b5d126f79d9"} Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.721512 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" event={"ID":"ad34be00-f9d6-41d7-9db0-decf8a030e53","Type":"ContainerStarted","Data":"e6138f3f76299f0e5f198fbd350880c00cd0f1966bb93ddc3b7235902c8586c8"} Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.738574 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" podStartSLOduration=1.738551909 podStartE2EDuration="1.738551909s" podCreationTimestamp="2025-10-03 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:01.734505508 +0000 UTC m=+1306.035999858" watchObservedRunningTime="2025-10-03 09:01:01.738551909 +0000 UTC m=+1306.040046239" Oct 03 09:01:01 crc kubenswrapper[4765]: I1003 09:01:01.779452 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:02 crc kubenswrapper[4765]: I1003 09:01:02.257798 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-krl79"] Oct 03 09:01:02 crc kubenswrapper[4765]: I1003 09:01:02.748364 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" event={"ID":"9ab77867-5e7d-4768-bfce-8e6346e5d1c1","Type":"ContainerStarted","Data":"7394fa8968e57e58857a0527abf4f3b23ec959eef0ec49bd23f20f27793b6cf6"} Oct 03 09:01:02 crc kubenswrapper[4765]: I1003 09:01:02.748653 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" event={"ID":"9ab77867-5e7d-4768-bfce-8e6346e5d1c1","Type":"ContainerStarted","Data":"700dad470f48edc90772738a48f9327d458e01cfb832e5a198d96c33b7ca8378"} Oct 03 09:01:02 crc kubenswrapper[4765]: I1003 09:01:02.784298 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" podStartSLOduration=1.7842737290000001 podStartE2EDuration="1.784273729s" podCreationTimestamp="2025-10-03 09:01:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:02.766524682 +0000 UTC m=+1307.068019012" watchObservedRunningTime="2025-10-03 09:01:02.784273729 +0000 UTC m=+1307.085768059" Oct 03 09:01:03 crc kubenswrapper[4765]: I1003 09:01:03.756524 4765 generic.go:334] "Generic (PLEG): container finished" podID="ad34be00-f9d6-41d7-9db0-decf8a030e53" containerID="572831d92fe9d68e1aad57cd3b86f654e656cf15a327624a653b0b5d126f79d9" exitCode=0 Oct 03 09:01:03 crc kubenswrapper[4765]: I1003 09:01:03.756768 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" event={"ID":"ad34be00-f9d6-41d7-9db0-decf8a030e53","Type":"ContainerDied","Data":"572831d92fe9d68e1aad57cd3b86f654e656cf15a327624a653b0b5d126f79d9"} Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.117809 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.163290 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-combined-ca-bundle\") pod \"ad34be00-f9d6-41d7-9db0-decf8a030e53\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.163364 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-config-data\") pod \"ad34be00-f9d6-41d7-9db0-decf8a030e53\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.163427 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-fernet-keys\") pod \"ad34be00-f9d6-41d7-9db0-decf8a030e53\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.163457 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkb7c\" (UniqueName: \"kubernetes.io/projected/ad34be00-f9d6-41d7-9db0-decf8a030e53-kube-api-access-gkb7c\") pod \"ad34be00-f9d6-41d7-9db0-decf8a030e53\" (UID: \"ad34be00-f9d6-41d7-9db0-decf8a030e53\") " Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.175871 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad34be00-f9d6-41d7-9db0-decf8a030e53-kube-api-access-gkb7c" (OuterVolumeSpecName: "kube-api-access-gkb7c") pod "ad34be00-f9d6-41d7-9db0-decf8a030e53" (UID: "ad34be00-f9d6-41d7-9db0-decf8a030e53"). InnerVolumeSpecName "kube-api-access-gkb7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.175991 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ad34be00-f9d6-41d7-9db0-decf8a030e53" (UID: "ad34be00-f9d6-41d7-9db0-decf8a030e53"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.185956 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad34be00-f9d6-41d7-9db0-decf8a030e53" (UID: "ad34be00-f9d6-41d7-9db0-decf8a030e53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.224599 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-config-data" (OuterVolumeSpecName: "config-data") pod "ad34be00-f9d6-41d7-9db0-decf8a030e53" (UID: "ad34be00-f9d6-41d7-9db0-decf8a030e53"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.265952 4765 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.266011 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkb7c\" (UniqueName: \"kubernetes.io/projected/ad34be00-f9d6-41d7-9db0-decf8a030e53-kube-api-access-gkb7c\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.266026 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.266038 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad34be00-f9d6-41d7-9db0-decf8a030e53-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.773161 4765 generic.go:334] "Generic (PLEG): container finished" podID="9ab77867-5e7d-4768-bfce-8e6346e5d1c1" containerID="7394fa8968e57e58857a0527abf4f3b23ec959eef0ec49bd23f20f27793b6cf6" exitCode=0 Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.773248 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" event={"ID":"9ab77867-5e7d-4768-bfce-8e6346e5d1c1","Type":"ContainerDied","Data":"7394fa8968e57e58857a0527abf4f3b23ec959eef0ec49bd23f20f27793b6cf6"} Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.774778 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" event={"ID":"ad34be00-f9d6-41d7-9db0-decf8a030e53","Type":"ContainerDied","Data":"e6138f3f76299f0e5f198fbd350880c00cd0f1966bb93ddc3b7235902c8586c8"} Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.774812 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6138f3f76299f0e5f198fbd350880c00cd0f1966bb93ddc3b7235902c8586c8" Oct 03 09:01:05 crc kubenswrapper[4765]: I1003 09:01:05.774876 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-cron-29324701-zwcd8" Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.157832 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.302096 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-config-data\") pod \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.302234 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-combined-ca-bundle\") pod \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.302259 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-db-sync-config-data\") pod \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.302367 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7dqt\" (UniqueName: \"kubernetes.io/projected/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-kube-api-access-q7dqt\") pod \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\" (UID: \"9ab77867-5e7d-4768-bfce-8e6346e5d1c1\") " Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.309586 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9ab77867-5e7d-4768-bfce-8e6346e5d1c1" (UID: "9ab77867-5e7d-4768-bfce-8e6346e5d1c1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.314330 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-kube-api-access-q7dqt" (OuterVolumeSpecName: "kube-api-access-q7dqt") pod "9ab77867-5e7d-4768-bfce-8e6346e5d1c1" (UID: "9ab77867-5e7d-4768-bfce-8e6346e5d1c1"). InnerVolumeSpecName "kube-api-access-q7dqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.331087 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ab77867-5e7d-4768-bfce-8e6346e5d1c1" (UID: "9ab77867-5e7d-4768-bfce-8e6346e5d1c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.356778 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-config-data" (OuterVolumeSpecName: "config-data") pod "9ab77867-5e7d-4768-bfce-8e6346e5d1c1" (UID: "9ab77867-5e7d-4768-bfce-8e6346e5d1c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.404560 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.404602 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.404614 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.404624 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7dqt\" (UniqueName: \"kubernetes.io/projected/9ab77867-5e7d-4768-bfce-8e6346e5d1c1-kube-api-access-q7dqt\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.805823 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" event={"ID":"9ab77867-5e7d-4768-bfce-8e6346e5d1c1","Type":"ContainerDied","Data":"700dad470f48edc90772738a48f9327d458e01cfb832e5a198d96c33b7ca8378"} Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.805864 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="700dad470f48edc90772738a48f9327d458e01cfb832e5a198d96c33b7ca8378" Oct 03 09:01:07 crc kubenswrapper[4765]: I1003 09:01:07.806070 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-krl79" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.061699 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:01:08 crc kubenswrapper[4765]: E1003 09:01:08.062171 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab77867-5e7d-4768-bfce-8e6346e5d1c1" containerName="watcher-kuttl-db-sync" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.062195 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab77867-5e7d-4768-bfce-8e6346e5d1c1" containerName="watcher-kuttl-db-sync" Oct 03 09:01:08 crc kubenswrapper[4765]: E1003 09:01:08.062245 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad34be00-f9d6-41d7-9db0-decf8a030e53" containerName="keystone-cron" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.062256 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad34be00-f9d6-41d7-9db0-decf8a030e53" containerName="keystone-cron" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.062431 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad34be00-f9d6-41d7-9db0-decf8a030e53" containerName="keystone-cron" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.062455 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab77867-5e7d-4768-bfce-8e6346e5d1c1" containerName="watcher-kuttl-db-sync" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.063633 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.066734 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-v6c2m" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.067543 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.068540 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.069553 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.074276 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.087429 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.095350 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.124765 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.126073 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.129113 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.143382 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.216984 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.217041 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68kng\" (UniqueName: \"kubernetes.io/projected/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-kube-api-access-68kng\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.217069 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.217286 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.217344 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.217368 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66d5305-05e9-4cf3-a8d4-513b77bdba78-logs\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.217499 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.217641 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxh2d\" (UniqueName: \"kubernetes.io/projected/e66d5305-05e9-4cf3-a8d4-513b77bdba78-kube-api-access-jxh2d\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.217937 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.319973 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320028 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66d5305-05e9-4cf3-a8d4-513b77bdba78-logs\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320061 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320086 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320129 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320179 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxh2d\" (UniqueName: \"kubernetes.io/projected/e66d5305-05e9-4cf3-a8d4-513b77bdba78-kube-api-access-jxh2d\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320228 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320267 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96419208-f57b-4a84-875b-1f9e851d7eda-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320320 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x65lw\" (UniqueName: \"kubernetes.io/projected/96419208-f57b-4a84-875b-1f9e851d7eda-kube-api-access-x65lw\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320355 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320390 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68kng\" (UniqueName: \"kubernetes.io/projected/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-kube-api-access-68kng\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320413 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320437 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320478 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.320528 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.321215 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66d5305-05e9-4cf3-a8d4-513b77bdba78-logs\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.324937 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.325396 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.326057 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.328129 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.338857 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.339213 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxh2d\" (UniqueName: 
\"kubernetes.io/projected/e66d5305-05e9-4cf3-a8d4-513b77bdba78-kube-api-access-jxh2d\") pod \"watcher-kuttl-api-0\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.344089 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68kng\" (UniqueName: \"kubernetes.io/projected/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-kube-api-access-68kng\") pod \"watcher-kuttl-applier-0\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.381041 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.391382 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.423076 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.423150 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.423224 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96419208-f57b-4a84-875b-1f9e851d7eda-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.423259 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x65lw\" (UniqueName: \"kubernetes.io/projected/96419208-f57b-4a84-875b-1f9e851d7eda-kube-api-access-x65lw\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.423323 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.423969 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96419208-f57b-4a84-875b-1f9e851d7eda-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.427618 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.428018 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.429093 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.445345 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x65lw\" (UniqueName: \"kubernetes.io/projected/96419208-f57b-4a84-875b-1f9e851d7eda-kube-api-access-x65lw\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.743161 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.857968 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:01:08 crc kubenswrapper[4765]: W1003 09:01:08.928815 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dc92c6d_ca03_4ad1_968e_fbba2da759c4.slice/crio-cac00616c30e8d87c861498b155c36212f32a9f57af95bed2f3230262e7ca69d WatchSource:0}: Error finding container cac00616c30e8d87c861498b155c36212f32a9f57af95bed2f3230262e7ca69d: Status 404 returned error can't find the container with id cac00616c30e8d87c861498b155c36212f32a9f57af95bed2f3230262e7ca69d Oct 03 09:01:08 crc kubenswrapper[4765]: I1003 09:01:08.929115 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.180947 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:01:09 crc kubenswrapper[4765]: W1003 09:01:09.184310 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96419208_f57b_4a84_875b_1f9e851d7eda.slice/crio-9a01f4ec73699ae72eab4d99b6b0e63a0074d205f7fc1aa2c848d3ce72a22711 WatchSource:0}: Error finding container 9a01f4ec73699ae72eab4d99b6b0e63a0074d205f7fc1aa2c848d3ce72a22711: Status 404 returned error can't find the container with id 9a01f4ec73699ae72eab4d99b6b0e63a0074d205f7fc1aa2c848d3ce72a22711 Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.822468 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
event={"ID":"2dc92c6d-ca03-4ad1-968e-fbba2da759c4","Type":"ContainerStarted","Data":"fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89"} Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.822519 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"2dc92c6d-ca03-4ad1-968e-fbba2da759c4","Type":"ContainerStarted","Data":"cac00616c30e8d87c861498b155c36212f32a9f57af95bed2f3230262e7ca69d"} Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.824413 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"96419208-f57b-4a84-875b-1f9e851d7eda","Type":"ContainerStarted","Data":"ba2fea4514c3b2f5c109eb4a23579391f812b1b9d121c46f92ec8d781d2a241d"} Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.824452 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"96419208-f57b-4a84-875b-1f9e851d7eda","Type":"ContainerStarted","Data":"9a01f4ec73699ae72eab4d99b6b0e63a0074d205f7fc1aa2c848d3ce72a22711"} Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.826818 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e66d5305-05e9-4cf3-a8d4-513b77bdba78","Type":"ContainerStarted","Data":"fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a"} Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.826869 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e66d5305-05e9-4cf3-a8d4-513b77bdba78","Type":"ContainerStarted","Data":"e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f"} Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.826884 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e66d5305-05e9-4cf3-a8d4-513b77bdba78","Type":"ContainerStarted","Data":"2033d7613e4e7a62f8315a5fb0d0ee1131fdeee2cd689b27659192aad85f3096"} Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.828050 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.886634 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=1.886616279 podStartE2EDuration="1.886616279s" podCreationTimestamp="2025-10-03 09:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:09.875633502 +0000 UTC m=+1314.177127842" watchObservedRunningTime="2025-10-03 09:01:09.886616279 +0000 UTC m=+1314.188110609" Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.900169 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.900143179 podStartE2EDuration="1.900143179s" podCreationTimestamp="2025-10-03 09:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:09.852048198 +0000 UTC m=+1314.153542538" watchObservedRunningTime="2025-10-03 09:01:09.900143179 +0000 UTC m=+1314.201637509" Oct 03 09:01:09 crc kubenswrapper[4765]: I1003 09:01:09.903815 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.903799771 podStartE2EDuration="1.903799771s" podCreationTimestamp="2025-10-03 09:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:09.898551299 +0000 UTC m=+1314.200045639" watchObservedRunningTime="2025-10-03 09:01:09.903799771 +0000 UTC m=+1314.205294111" Oct 03 09:01:11 crc kubenswrapper[4765]: I1003 09:01:11.843439 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:01:12 crc kubenswrapper[4765]: I1003 09:01:12.165432 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:13 crc kubenswrapper[4765]: I1003 09:01:13.381715 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:13 crc kubenswrapper[4765]: I1003 09:01:13.392719 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:17 crc kubenswrapper[4765]: I1003 09:01:17.020942 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:18 crc kubenswrapper[4765]: I1003 09:01:18.382123 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:18 crc kubenswrapper[4765]: I1003 09:01:18.389528 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:18 crc kubenswrapper[4765]: I1003 09:01:18.392018 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:18 crc kubenswrapper[4765]: I1003 09:01:18.434796 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:18 crc kubenswrapper[4765]: I1003 09:01:18.744654 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:18 crc kubenswrapper[4765]: I1003 09:01:18.771320 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:18 crc kubenswrapper[4765]: I1003 09:01:18.913568 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:18 crc kubenswrapper[4765]: I1003 09:01:18.921660 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:18 crc kubenswrapper[4765]: I1003 09:01:18.954297 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:18 crc kubenswrapper[4765]: I1003 09:01:18.958137 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.320399 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-krl79"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.326742 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["watcher-kuttl-default/watcher-kuttl-db-sync-krl79"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.366315 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.415372 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.434182 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher3038-account-delete-fxxhm"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.436701 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3038-account-delete-fxxhm" Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.455088 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher3038-account-delete-fxxhm"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.493710 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-dvh78"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.510709 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-dvh78"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.532266 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-3038-account-create-78hd4"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.538503 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.544432 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-3038-account-create-78hd4"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.547435 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbpxs\" (UniqueName: \"kubernetes.io/projected/38d88a2b-52f0-4658-b7ec-df3c9c91069f-kube-api-access-nbpxs\") pod \"watcher3038-account-delete-fxxhm\" (UID: \"38d88a2b-52f0-4658-b7ec-df3c9c91069f\") " pod="watcher-kuttl-default/watcher3038-account-delete-fxxhm" Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.553796 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher3038-account-delete-fxxhm"] Oct 03 09:01:20 crc kubenswrapper[4765]: E1003 09:01:20.554666 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-nbpxs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/watcher3038-account-delete-fxxhm" podUID="38d88a2b-52f0-4658-b7ec-df3c9c91069f" Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.649077 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbpxs\" (UniqueName: \"kubernetes.io/projected/38d88a2b-52f0-4658-b7ec-df3c9c91069f-kube-api-access-nbpxs\") pod \"watcher3038-account-delete-fxxhm\" (UID: \"38d88a2b-52f0-4658-b7ec-df3c9c91069f\") " pod="watcher-kuttl-default/watcher3038-account-delete-fxxhm" Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.676230 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbpxs\" (UniqueName: \"kubernetes.io/projected/38d88a2b-52f0-4658-b7ec-df3c9c91069f-kube-api-access-nbpxs\") pod \"watcher3038-account-delete-fxxhm\" (UID: 
\"38d88a2b-52f0-4658-b7ec-df3c9c91069f\") " pod="watcher-kuttl-default/watcher3038-account-delete-fxxhm" Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.748386 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.748692 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="sg-core" containerID="cri-o://fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8" gracePeriod=30 Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.748756 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="ceilometer-notification-agent" containerID="cri-o://85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0" gracePeriod=30 Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.748662 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="ceilometer-central-agent" containerID="cri-o://e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0" gracePeriod=30 Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.748818 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="proxy-httpd" containerID="cri-o://3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67" gracePeriod=30 Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.930700 4765 generic.go:334] "Generic (PLEG): container finished" podID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerID="fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8" exitCode=2 Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.930789 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e4e9013d-4e82-42e9-81f4-f3a8608d564d","Type":"ContainerDied","Data":"fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8"} Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.930954 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3038-account-delete-fxxhm" Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.930953 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="e66d5305-05e9-4cf3-a8d4-513b77bdba78" containerName="watcher-kuttl-api-log" containerID="cri-o://e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f" gracePeriod=30 Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.930992 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="e66d5305-05e9-4cf3-a8d4-513b77bdba78" containerName="watcher-api" containerID="cri-o://fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a" gracePeriod=30 Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.931742 4765 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-v6c2m\" not found" Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.931942 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="2dc92c6d-ca03-4ad1-968e-fbba2da759c4" containerName="watcher-applier" containerID="cri-o://fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89" gracePeriod=30 Oct 03 09:01:20 crc kubenswrapper[4765]: I1003 09:01:20.942623 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher3038-account-delete-fxxhm" Oct 03 09:01:20 crc kubenswrapper[4765]: E1003 09:01:20.954256 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:01:20 crc kubenswrapper[4765]: E1003 09:01:20.954332 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data podName:96419208-f57b-4a84-875b-1f9e851d7eda nodeName:}" failed. No retries permitted until 2025-10-03 09:01:21.454310996 +0000 UTC m=+1325.755805326 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "96419208-f57b-4a84-875b-1f9e851d7eda") : secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.058457 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbpxs\" (UniqueName: \"kubernetes.io/projected/38d88a2b-52f0-4658-b7ec-df3c9c91069f-kube-api-access-nbpxs\") pod \"38d88a2b-52f0-4658-b7ec-df3c9c91069f\" (UID: \"38d88a2b-52f0-4658-b7ec-df3c9c91069f\") " Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.070515 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d88a2b-52f0-4658-b7ec-df3c9c91069f-kube-api-access-nbpxs" (OuterVolumeSpecName: "kube-api-access-nbpxs") pod "38d88a2b-52f0-4658-b7ec-df3c9c91069f" (UID: "38d88a2b-52f0-4658-b7ec-df3c9c91069f"). InnerVolumeSpecName "kube-api-access-nbpxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.161290 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbpxs\" (UniqueName: \"kubernetes.io/projected/38d88a2b-52f0-4658-b7ec-df3c9c91069f-kube-api-access-nbpxs\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:21 crc kubenswrapper[4765]: E1003 09:01:21.468324 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:01:21 crc kubenswrapper[4765]: E1003 09:01:21.469297 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data podName:96419208-f57b-4a84-875b-1f9e851d7eda nodeName:}" failed. No retries permitted until 2025-10-03 09:01:22.469275096 +0000 UTC m=+1326.770769426 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "96419208-f57b-4a84-875b-1f9e851d7eda") : secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.940443 4765 generic.go:334] "Generic (PLEG): container finished" podID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerID="3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67" exitCode=0 Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.940929 4765 generic.go:334] "Generic (PLEG): container finished" podID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerID="e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0" exitCode=0 Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.940517 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e4e9013d-4e82-42e9-81f4-f3a8608d564d","Type":"ContainerDied","Data":"3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67"} Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.941197 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e4e9013d-4e82-42e9-81f4-f3a8608d564d","Type":"ContainerDied","Data":"e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0"} Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.943156 4765 generic.go:334] "Generic (PLEG): container finished" podID="e66d5305-05e9-4cf3-a8d4-513b77bdba78" containerID="e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f" exitCode=143 Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.943262 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher3038-account-delete-fxxhm" Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.943330 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e66d5305-05e9-4cf3-a8d4-513b77bdba78","Type":"ContainerDied","Data":"e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f"} Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.943567 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="96419208-f57b-4a84-875b-1f9e851d7eda" containerName="watcher-decision-engine" containerID="cri-o://ba2fea4514c3b2f5c109eb4a23579391f812b1b9d121c46f92ec8d781d2a241d" gracePeriod=30 Oct 03 09:01:21 crc kubenswrapper[4765]: I1003 09:01:21.993866 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher3038-account-delete-fxxhm"] Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.001319 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher3038-account-delete-fxxhm"] Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.322007 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d88a2b-52f0-4658-b7ec-df3c9c91069f" path="/var/lib/kubelet/pods/38d88a2b-52f0-4658-b7ec-df3c9c91069f/volumes" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.322403 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693c5bb3-d4e8-4f16-863e-c79e9d1de450" path="/var/lib/kubelet/pods/693c5bb3-d4e8-4f16-863e-c79e9d1de450/volumes" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.325939 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab77867-5e7d-4768-bfce-8e6346e5d1c1" path="/var/lib/kubelet/pods/9ab77867-5e7d-4768-bfce-8e6346e5d1c1/volumes" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.326470 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d833609e-467c-49a2-ac00-c5224ffb1cb5" path="/var/lib/kubelet/pods/d833609e-467c-49a2-ac00-c5224ffb1cb5/volumes" Oct 03 09:01:22 crc kubenswrapper[4765]: E1003 09:01:22.484341 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:01:22 crc kubenswrapper[4765]: E1003 09:01:22.484426 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data podName:96419208-f57b-4a84-875b-1f9e851d7eda nodeName:}" failed. No retries permitted until 2025-10-03 09:01:24.484406114 +0000 UTC m=+1328.785900454 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "96419208-f57b-4a84-875b-1f9e851d7eda") : secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.650421 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.687139 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-custom-prometheus-ca\") pod \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.687238 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66d5305-05e9-4cf3-a8d4-513b77bdba78-logs\") pod \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.687299 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxh2d\" (UniqueName: \"kubernetes.io/projected/e66d5305-05e9-4cf3-a8d4-513b77bdba78-kube-api-access-jxh2d\") pod \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.687517 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-config-data\") pod \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.687546 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-combined-ca-bundle\") pod \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\" (UID: \"e66d5305-05e9-4cf3-a8d4-513b77bdba78\") " Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.688455 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e66d5305-05e9-4cf3-a8d4-513b77bdba78-logs" (OuterVolumeSpecName: "logs") pod "e66d5305-05e9-4cf3-a8d4-513b77bdba78" (UID: "e66d5305-05e9-4cf3-a8d4-513b77bdba78"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.695327 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e66d5305-05e9-4cf3-a8d4-513b77bdba78-kube-api-access-jxh2d" (OuterVolumeSpecName: "kube-api-access-jxh2d") pod "e66d5305-05e9-4cf3-a8d4-513b77bdba78" (UID: "e66d5305-05e9-4cf3-a8d4-513b77bdba78"). InnerVolumeSpecName "kube-api-access-jxh2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.721820 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e66d5305-05e9-4cf3-a8d4-513b77bdba78" (UID: "e66d5305-05e9-4cf3-a8d4-513b77bdba78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.726135 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "e66d5305-05e9-4cf3-a8d4-513b77bdba78" (UID: "e66d5305-05e9-4cf3-a8d4-513b77bdba78"). 
InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.764844 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-config-data" (OuterVolumeSpecName: "config-data") pod "e66d5305-05e9-4cf3-a8d4-513b77bdba78" (UID: "e66d5305-05e9-4cf3-a8d4-513b77bdba78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.794246 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e66d5305-05e9-4cf3-a8d4-513b77bdba78-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.794290 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxh2d\" (UniqueName: \"kubernetes.io/projected/e66d5305-05e9-4cf3-a8d4-513b77bdba78-kube-api-access-jxh2d\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.794303 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.794315 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.794326 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e66d5305-05e9-4cf3-a8d4-513b77bdba78-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.952889 4765 generic.go:334] "Generic (PLEG): container finished" podID="e66d5305-05e9-4cf3-a8d4-513b77bdba78" containerID="fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a" exitCode=0 Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.952948 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.952968 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e66d5305-05e9-4cf3-a8d4-513b77bdba78","Type":"ContainerDied","Data":"fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a"} Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.953211 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e66d5305-05e9-4cf3-a8d4-513b77bdba78","Type":"ContainerDied","Data":"2033d7613e4e7a62f8315a5fb0d0ee1131fdeee2cd689b27659192aad85f3096"} Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.953232 4765 scope.go:117] "RemoveContainer" containerID="fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a" Oct 03 09:01:22 crc kubenswrapper[4765]: I1003 09:01:22.978132 4765 scope.go:117] "RemoveContainer" containerID="e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f" Oct 03 09:01:23 crc kubenswrapper[4765]: I1003 09:01:23.001491 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:01:23 crc kubenswrapper[4765]: I1003 09:01:23.012099 4765 scope.go:117] "RemoveContainer" containerID="fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a" Oct 03 09:01:23 crc kubenswrapper[4765]: E1003 09:01:23.012614 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a\": container with ID starting with fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a not found: ID does not exist" containerID="fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a" Oct 03 09:01:23 crc kubenswrapper[4765]: I1003 09:01:23.012661 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a"} err="failed to get container status \"fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a\": rpc error: code = NotFound desc = could not find container \"fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a\": container with ID starting with fa8202d1d727d47cd3288a1fd2ab6d2b6ebc4c536c6f697fbd3cd688857d726a not found: ID does not exist" Oct 03 09:01:23 crc kubenswrapper[4765]: I1003 09:01:23.012687 4765 scope.go:117] "RemoveContainer" containerID="e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f" Oct 03 09:01:23 crc kubenswrapper[4765]: E1003 09:01:23.012929 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f\": container with ID starting with e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f not found: ID does not exist" containerID="e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f" Oct 03 09:01:23 crc kubenswrapper[4765]: I1003 09:01:23.012954 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f"} err="failed to get container status \"e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f\": rpc error: code = NotFound desc = could not find container 
\"e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f\": container with ID starting with e52f8ae5caf6db690d28d2269de31c3837bb3873534dc0f76a4a4d1ea8d5ea5f not found: ID does not exist" Oct 03 09:01:23 crc kubenswrapper[4765]: I1003 09:01:23.015544 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:01:23 crc kubenswrapper[4765]: E1003 09:01:23.394621 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:01:23 crc kubenswrapper[4765]: E1003 09:01:23.399811 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:01:23 crc kubenswrapper[4765]: E1003 09:01:23.401828 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:01:23 crc kubenswrapper[4765]: E1003 09:01:23.401917 4765 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="2dc92c6d-ca03-4ad1-968e-fbba2da759c4" containerName="watcher-applier" Oct 03 09:01:24 crc kubenswrapper[4765]: I1003 09:01:24.319469 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e66d5305-05e9-4cf3-a8d4-513b77bdba78" path="/var/lib/kubelet/pods/e66d5305-05e9-4cf3-a8d4-513b77bdba78/volumes" Oct 03 09:01:24 crc kubenswrapper[4765]: E1003 09:01:24.531897 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:01:24 crc kubenswrapper[4765]: E1003 09:01:24.531962 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data podName:96419208-f57b-4a84-875b-1f9e851d7eda nodeName:}" failed. No retries permitted until 2025-10-03 09:01:28.531948687 +0000 UTC m=+1332.833443017 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "96419208-f57b-4a84-875b-1f9e851d7eda") : secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.673605 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.754290 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-combined-ca-bundle\") pod \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.754396 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-logs\") pod \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.754448 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-config-data\") pod \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.754528 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68kng\" (UniqueName: \"kubernetes.io/projected/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-kube-api-access-68kng\") pod \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\" (UID: \"2dc92c6d-ca03-4ad1-968e-fbba2da759c4\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.754880 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-logs" (OuterVolumeSpecName: "logs") pod "2dc92c6d-ca03-4ad1-968e-fbba2da759c4" (UID: "2dc92c6d-ca03-4ad1-968e-fbba2da759c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.772424 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-kube-api-access-68kng" (OuterVolumeSpecName: "kube-api-access-68kng") pod "2dc92c6d-ca03-4ad1-968e-fbba2da759c4" (UID: "2dc92c6d-ca03-4ad1-968e-fbba2da759c4"). InnerVolumeSpecName "kube-api-access-68kng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.785494 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dc92c6d-ca03-4ad1-968e-fbba2da759c4" (UID: "2dc92c6d-ca03-4ad1-968e-fbba2da759c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.796543 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-config-data" (OuterVolumeSpecName: "config-data") pod "2dc92c6d-ca03-4ad1-968e-fbba2da759c4" (UID: "2dc92c6d-ca03-4ad1-968e-fbba2da759c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.808163 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.856564 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-config-data\") pod \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.856628 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84k42\" (UniqueName: \"kubernetes.io/projected/e4e9013d-4e82-42e9-81f4-f3a8608d564d-kube-api-access-84k42\") pod \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.856677 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-ceilometer-tls-certs\") pod \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.856750 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-run-httpd\") pod \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.856791 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-scripts\") pod \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.856849 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-combined-ca-bundle\") pod \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.856873 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-sg-core-conf-yaml\") pod \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.856954 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-log-httpd\") pod \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\" (UID: \"e4e9013d-4e82-42e9-81f4-f3a8608d564d\") " Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.857339 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.857358 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.857370 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68kng\" (UniqueName: 
\"kubernetes.io/projected/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-kube-api-access-68kng\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.857381 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dc92c6d-ca03-4ad1-968e-fbba2da759c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.857782 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e4e9013d-4e82-42e9-81f4-f3a8608d564d" (UID: "e4e9013d-4e82-42e9-81f4-f3a8608d564d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.857886 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e4e9013d-4e82-42e9-81f4-f3a8608d564d" (UID: "e4e9013d-4e82-42e9-81f4-f3a8608d564d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.863415 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-scripts" (OuterVolumeSpecName: "scripts") pod "e4e9013d-4e82-42e9-81f4-f3a8608d564d" (UID: "e4e9013d-4e82-42e9-81f4-f3a8608d564d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.863466 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e9013d-4e82-42e9-81f4-f3a8608d564d-kube-api-access-84k42" (OuterVolumeSpecName: "kube-api-access-84k42") pod "e4e9013d-4e82-42e9-81f4-f3a8608d564d" (UID: "e4e9013d-4e82-42e9-81f4-f3a8608d564d"). InnerVolumeSpecName "kube-api-access-84k42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.888533 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e4e9013d-4e82-42e9-81f4-f3a8608d564d" (UID: "e4e9013d-4e82-42e9-81f4-f3a8608d564d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.906693 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e4e9013d-4e82-42e9-81f4-f3a8608d564d" (UID: "e4e9013d-4e82-42e9-81f4-f3a8608d564d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.939319 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4e9013d-4e82-42e9-81f4-f3a8608d564d" (UID: "e4e9013d-4e82-42e9-81f4-f3a8608d564d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.939356 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-config-data" (OuterVolumeSpecName: "config-data") pod "e4e9013d-4e82-42e9-81f4-f3a8608d564d" (UID: "e4e9013d-4e82-42e9-81f4-f3a8608d564d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.960870 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.960905 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.960915 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.960925 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.960932 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e9013d-4e82-42e9-81f4-f3a8608d564d-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.960940 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.960948 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84k42\" (UniqueName: \"kubernetes.io/projected/e4e9013d-4e82-42e9-81f4-f3a8608d564d-kube-api-access-84k42\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.960958 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9013d-4e82-42e9-81f4-f3a8608d564d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.978020 4765 generic.go:334] "Generic (PLEG): container finished" podID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerID="85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0" exitCode=0 Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.978068 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e4e9013d-4e82-42e9-81f4-f3a8608d564d","Type":"ContainerDied","Data":"85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0"} Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.978395 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e4e9013d-4e82-42e9-81f4-f3a8608d564d","Type":"ContainerDied","Data":"3615654e2130e877de03b052069a7174cab00b6321bc4c569e070af7219260c6"} Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.978483 4765 
scope.go:117] "RemoveContainer" containerID="3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.978080 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.980816 4765 generic.go:334] "Generic (PLEG): container finished" podID="2dc92c6d-ca03-4ad1-968e-fbba2da759c4" containerID="fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89" exitCode=0 Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.980896 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"2dc92c6d-ca03-4ad1-968e-fbba2da759c4","Type":"ContainerDied","Data":"fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89"} Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.981079 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"2dc92c6d-ca03-4ad1-968e-fbba2da759c4","Type":"ContainerDied","Data":"cac00616c30e8d87c861498b155c36212f32a9f57af95bed2f3230262e7ca69d"} Oct 03 09:01:25 crc kubenswrapper[4765]: I1003 09:01:25.980922 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.002545 4765 scope.go:117] "RemoveContainer" containerID="fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.019966 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.033515 4765 scope.go:117] "RemoveContainer" containerID="85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.035536 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.055831 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.058347 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.066376 4765 scope.go:117] "RemoveContainer" containerID="e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.080250 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:01:26 crc kubenswrapper[4765]: E1003 09:01:26.080634 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66d5305-05e9-4cf3-a8d4-513b77bdba78" containerName="watcher-api" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.080670 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66d5305-05e9-4cf3-a8d4-513b77bdba78" containerName="watcher-api" Oct 03 09:01:26 crc kubenswrapper[4765]: E1003 09:01:26.080686 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc92c6d-ca03-4ad1-968e-fbba2da759c4" containerName="watcher-applier" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.080696 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc92c6d-ca03-4ad1-968e-fbba2da759c4" containerName="watcher-applier" Oct 03 09:01:26 crc 
kubenswrapper[4765]: E1003 09:01:26.080714 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="proxy-httpd" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.080722 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="proxy-httpd" Oct 03 09:01:26 crc kubenswrapper[4765]: E1003 09:01:26.080744 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="sg-core" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.080751 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="sg-core" Oct 03 09:01:26 crc kubenswrapper[4765]: E1003 09:01:26.080769 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="ceilometer-central-agent" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.080776 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="ceilometer-central-agent" Oct 03 09:01:26 crc kubenswrapper[4765]: E1003 09:01:26.080786 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="ceilometer-notification-agent" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.080794 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="ceilometer-notification-agent" Oct 03 09:01:26 crc kubenswrapper[4765]: E1003 09:01:26.080807 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e66d5305-05e9-4cf3-a8d4-513b77bdba78" containerName="watcher-kuttl-api-log" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.080816 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e66d5305-05e9-4cf3-a8d4-513b77bdba78" containerName="watcher-kuttl-api-log" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.081037 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="ceilometer-notification-agent" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.081054 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66d5305-05e9-4cf3-a8d4-513b77bdba78" containerName="watcher-api" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.081075 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="ceilometer-central-agent" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.081085 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="proxy-httpd" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.081094 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" containerName="sg-core" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.081108 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc92c6d-ca03-4ad1-968e-fbba2da759c4" containerName="watcher-applier" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.081139 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e66d5305-05e9-4cf3-a8d4-513b77bdba78" containerName="watcher-kuttl-api-log" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.082781 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.085108 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.085312 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.085484 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.088715 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.098314 4765 scope.go:117] "RemoveContainer" containerID="3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67" Oct 03 09:01:26 crc kubenswrapper[4765]: E1003 09:01:26.100615 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67\": container with ID starting with 3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67 not found: ID does not exist" containerID="3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.100664 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67"} err="failed to get container status \"3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67\": rpc error: code = NotFound desc = could not find container \"3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67\": container with ID starting with 3e41e5b6a1677af5c275f81caabde4ed38e4b598326418f025a27a74dec79b67 not found: ID does not exist" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.100687 4765 scope.go:117] "RemoveContainer" containerID="fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8" Oct 03 09:01:26 crc kubenswrapper[4765]: E1003 09:01:26.109522 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8\": container with ID starting with fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8 not found: ID does not exist" containerID="fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.109572 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8"} err="failed to get container status \"fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8\": rpc error: code = NotFound desc = could not find container \"fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8\": container with ID starting with fc9c8a0f1b7fd46433a370ba82c5a2b0ce850c29c99455c75f14ea35a687c8d8 not found: ID does not exist" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.109606 4765 scope.go:117] "RemoveContainer" containerID="85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0" Oct 03 09:01:26 crc kubenswrapper[4765]: E1003 09:01:26.111059 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0\": container with ID starting with 85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0 not found: ID does not exist" containerID="85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.111094 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0"} err="failed to get container status \"85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0\": rpc error: code = NotFound desc = could not find container \"85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0\": container with ID starting with 85689c79ad5864cf081c3be5eee11bfcbf52ba81da4163e3f61da84f9eb536c0 not found: ID does not exist" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.111113 4765 scope.go:117] "RemoveContainer" containerID="e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0" Oct 03 09:01:26 crc kubenswrapper[4765]: E1003 09:01:26.111874 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0\": container with ID starting with e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0 not found: ID does not exist" containerID="e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.111912 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0"} err="failed to get container status \"e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0\": rpc error: code = NotFound desc = could not find container \"e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0\": container with ID starting with e926be1b2eddb4c1ba41b06d9cae1d186dfae963aa48eb29e85be7e29e8b48b0 not found: ID does not exist" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.111931 4765 scope.go:117] "RemoveContainer" containerID="fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.128371 4765 scope.go:117] "RemoveContainer" containerID="fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89" Oct 03 09:01:26 crc kubenswrapper[4765]: E1003 09:01:26.128757 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89\": container with ID starting with fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89 not found: ID does not exist" containerID="fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.128781 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89"} err="failed to get container status \"fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89\": rpc error: code = NotFound desc = could not find container \"fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89\": container with ID starting with fe6bd96bd0ee31f36d5282d04af28274d3605bfec0b6f768f38898134c1e7c89 not 
found: ID does not exist" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.163733 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-config-data\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.163775 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.163795 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-scripts\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.163814 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-log-httpd\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.163884 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.163902 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmdfj\" (UniqueName: \"kubernetes.io/projected/82df5442-55c2-414c-b79e-1b2cb7ea3c98-kube-api-access-rmdfj\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.163920 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-run-httpd\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.163969 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.265601 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-config-data\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.265743 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-scripts\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.265767 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.265793 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-log-httpd\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.265832 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.265855 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmdfj\" (UniqueName: \"kubernetes.io/projected/82df5442-55c2-414c-b79e-1b2cb7ea3c98-kube-api-access-rmdfj\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.265877 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-run-httpd\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.266589 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-log-httpd\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.266620 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.266904 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-run-httpd\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.270947 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " 
pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.270960 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-scripts\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.271290 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.271391 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.272576 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-config-data\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.298521 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmdfj\" (UniqueName: \"kubernetes.io/projected/82df5442-55c2-414c-b79e-1b2cb7ea3c98-kube-api-access-rmdfj\") pod \"ceilometer-0\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.319523 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc92c6d-ca03-4ad1-968e-fbba2da759c4" path="/var/lib/kubelet/pods/2dc92c6d-ca03-4ad1-968e-fbba2da759c4/volumes" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.320402 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e9013d-4e82-42e9-81f4-f3a8608d564d" path="/var/lib/kubelet/pods/e4e9013d-4e82-42e9-81f4-f3a8608d564d/volumes" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.405755 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:26 crc kubenswrapper[4765]: I1003 09:01:26.878882 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.022902 4765 generic.go:334] "Generic (PLEG): container finished" podID="96419208-f57b-4a84-875b-1f9e851d7eda" containerID="ba2fea4514c3b2f5c109eb4a23579391f812b1b9d121c46f92ec8d781d2a241d" exitCode=0 Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.023183 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"96419208-f57b-4a84-875b-1f9e851d7eda","Type":"ContainerDied","Data":"ba2fea4514c3b2f5c109eb4a23579391f812b1b9d121c46f92ec8d781d2a241d"} Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.029858 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"82df5442-55c2-414c-b79e-1b2cb7ea3c98","Type":"ContainerStarted","Data":"241c556bd94929d659b63c12be96486d595b488142f56b5dc8ed3c87fae1cecd"} Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.078271 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.182613 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-custom-prometheus-ca\") pod \"96419208-f57b-4a84-875b-1f9e851d7eda\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.182863 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x65lw\" (UniqueName: \"kubernetes.io/projected/96419208-f57b-4a84-875b-1f9e851d7eda-kube-api-access-x65lw\") pod \"96419208-f57b-4a84-875b-1f9e851d7eda\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.182914 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-combined-ca-bundle\") pod \"96419208-f57b-4a84-875b-1f9e851d7eda\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.182936 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96419208-f57b-4a84-875b-1f9e851d7eda-logs\") pod \"96419208-f57b-4a84-875b-1f9e851d7eda\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.182999 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data\") pod \"96419208-f57b-4a84-875b-1f9e851d7eda\" (UID: \"96419208-f57b-4a84-875b-1f9e851d7eda\") " Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.184287 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96419208-f57b-4a84-875b-1f9e851d7eda-logs" (OuterVolumeSpecName: "logs") pod "96419208-f57b-4a84-875b-1f9e851d7eda" (UID: "96419208-f57b-4a84-875b-1f9e851d7eda"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.190926 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96419208-f57b-4a84-875b-1f9e851d7eda-kube-api-access-x65lw" (OuterVolumeSpecName: "kube-api-access-x65lw") pod "96419208-f57b-4a84-875b-1f9e851d7eda" (UID: "96419208-f57b-4a84-875b-1f9e851d7eda"). InnerVolumeSpecName "kube-api-access-x65lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.209827 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96419208-f57b-4a84-875b-1f9e851d7eda" (UID: "96419208-f57b-4a84-875b-1f9e851d7eda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.211795 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "96419208-f57b-4a84-875b-1f9e851d7eda" (UID: "96419208-f57b-4a84-875b-1f9e851d7eda"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.238824 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data" (OuterVolumeSpecName: "config-data") pod "96419208-f57b-4a84-875b-1f9e851d7eda" (UID: "96419208-f57b-4a84-875b-1f9e851d7eda"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.284758 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.284802 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x65lw\" (UniqueName: \"kubernetes.io/projected/96419208-f57b-4a84-875b-1f9e851d7eda-kube-api-access-x65lw\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.284816 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.284828 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96419208-f57b-4a84-875b-1f9e851d7eda-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:27 crc kubenswrapper[4765]: I1003 09:01:27.284843 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96419208-f57b-4a84-875b-1f9e851d7eda-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:28 crc kubenswrapper[4765]: I1003 09:01:28.039013 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"96419208-f57b-4a84-875b-1f9e851d7eda","Type":"ContainerDied","Data":"9a01f4ec73699ae72eab4d99b6b0e63a0074d205f7fc1aa2c848d3ce72a22711"} Oct 03 09:01:28 crc kubenswrapper[4765]: I1003 09:01:28.039322 4765 scope.go:117] "RemoveContainer" containerID="ba2fea4514c3b2f5c109eb4a23579391f812b1b9d121c46f92ec8d781d2a241d" Oct 03 09:01:28 crc kubenswrapper[4765]: I1003 09:01:28.040467 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:28 crc kubenswrapper[4765]: I1003 09:01:28.041784 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"82df5442-55c2-414c-b79e-1b2cb7ea3c98","Type":"ContainerStarted","Data":"539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759"} Oct 03 09:01:28 crc kubenswrapper[4765]: I1003 09:01:28.092990 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:01:28 crc kubenswrapper[4765]: I1003 09:01:28.101703 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:01:28 crc kubenswrapper[4765]: I1003 09:01:28.318721 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96419208-f57b-4a84-875b-1f9e851d7eda" path="/var/lib/kubelet/pods/96419208-f57b-4a84-875b-1f9e851d7eda/volumes" Oct 03 09:01:29 crc kubenswrapper[4765]: I1003 09:01:29.053660 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"82df5442-55c2-414c-b79e-1b2cb7ea3c98","Type":"ContainerStarted","Data":"93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3"} Oct 03 09:01:29 crc kubenswrapper[4765]: I1003 09:01:29.053917 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"82df5442-55c2-414c-b79e-1b2cb7ea3c98","Type":"ContainerStarted","Data":"9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9"} Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.377718 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-zcrmj"] Oct 03 09:01:30 crc kubenswrapper[4765]: E1003 09:01:30.378408 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96419208-f57b-4a84-875b-1f9e851d7eda" containerName="watcher-decision-engine" Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.378423 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="96419208-f57b-4a84-875b-1f9e851d7eda" containerName="watcher-decision-engine" Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.378592 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="96419208-f57b-4a84-875b-1f9e851d7eda" containerName="watcher-decision-engine" Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.379243 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-zcrmj" Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.385080 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-zcrmj"] Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.439390 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpzx2\" (UniqueName: \"kubernetes.io/projected/79ccdb04-ca5c-4266-8367-b0a04310f061-kube-api-access-qpzx2\") pod \"watcher-db-create-zcrmj\" (UID: \"79ccdb04-ca5c-4266-8367-b0a04310f061\") " pod="watcher-kuttl-default/watcher-db-create-zcrmj" Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.540613 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpzx2\" (UniqueName: \"kubernetes.io/projected/79ccdb04-ca5c-4266-8367-b0a04310f061-kube-api-access-qpzx2\") pod \"watcher-db-create-zcrmj\" (UID: \"79ccdb04-ca5c-4266-8367-b0a04310f061\") " pod="watcher-kuttl-default/watcher-db-create-zcrmj" Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.556427 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpzx2\" (UniqueName: \"kubernetes.io/projected/79ccdb04-ca5c-4266-8367-b0a04310f061-kube-api-access-qpzx2\") pod \"watcher-db-create-zcrmj\" (UID: \"79ccdb04-ca5c-4266-8367-b0a04310f061\") " pod="watcher-kuttl-default/watcher-db-create-zcrmj" Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.680147 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.680205 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.680247 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.680901 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fb31d91d836934f6972647efe119143f3dae08e768c5506a0423da0d8bd74e8"} pod="openshift-machine-config-operator/machine-config-daemon-j8mss" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.680965 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" containerID="cri-o://6fb31d91d836934f6972647efe119143f3dae08e768c5506a0423da0d8bd74e8" gracePeriod=600 Oct 03 09:01:30 crc kubenswrapper[4765]: I1003 09:01:30.717397 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-zcrmj" Oct 03 09:01:31 crc kubenswrapper[4765]: I1003 09:01:31.118982 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"82df5442-55c2-414c-b79e-1b2cb7ea3c98","Type":"ContainerStarted","Data":"d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6"} Oct 03 09:01:31 crc kubenswrapper[4765]: I1003 09:01:31.120827 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:31 crc kubenswrapper[4765]: I1003 09:01:31.179692 4765 generic.go:334] "Generic (PLEG): container finished" podID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerID="6fb31d91d836934f6972647efe119143f3dae08e768c5506a0423da0d8bd74e8" exitCode=0 Oct 03 09:01:31 crc kubenswrapper[4765]: I1003 09:01:31.179739 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerDied","Data":"6fb31d91d836934f6972647efe119143f3dae08e768c5506a0423da0d8bd74e8"} Oct 03 09:01:31 crc kubenswrapper[4765]: I1003 09:01:31.179766 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800"} Oct 03 09:01:31 crc kubenswrapper[4765]: I1003 09:01:31.179783 4765 scope.go:117] "RemoveContainer" containerID="3ea26be161c74098dfac030bd0f30c6280c924f6e3ecfdb459bea4fd5d08ace1" Oct 03 09:01:31 crc kubenswrapper[4765]: I1003 09:01:31.184266 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.923339828 podStartE2EDuration="5.184247551s" podCreationTimestamp="2025-10-03 09:01:26 +0000 UTC" firstStartedPulling="2025-10-03 09:01:26.992220195 +0000 UTC m=+1331.293714525" lastFinishedPulling="2025-10-03 09:01:30.253127918 +0000 UTC m=+1334.554622248" observedRunningTime="2025-10-03 09:01:31.180100327 +0000 UTC m=+1335.481594647" watchObservedRunningTime="2025-10-03 09:01:31.184247551 +0000 UTC m=+1335.485741881" Oct 03 09:01:31 crc kubenswrapper[4765]: I1003 09:01:31.292695 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-zcrmj"] Oct 03 09:01:31 crc kubenswrapper[4765]: W1003 09:01:31.293973 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79ccdb04_ca5c_4266_8367_b0a04310f061.slice/crio-d4ac88c2da68c6610bf0c826cd4115345e11cdeaa80e6dbcb7ddffd689fdad8c WatchSource:0}: Error finding container d4ac88c2da68c6610bf0c826cd4115345e11cdeaa80e6dbcb7ddffd689fdad8c: Status 404 returned error can't find the container with id d4ac88c2da68c6610bf0c826cd4115345e11cdeaa80e6dbcb7ddffd689fdad8c Oct 03 09:01:32 crc kubenswrapper[4765]: I1003 09:01:32.191580 4765 generic.go:334] "Generic (PLEG): container finished" podID="79ccdb04-ca5c-4266-8367-b0a04310f061" containerID="24a927e1eb93abbbf3d89f6e61c8d60c35cb8444b79fef81158ad66d2e80d4ab" exitCode=0 Oct 03 09:01:32 crc kubenswrapper[4765]: I1003 09:01:32.191687 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-zcrmj" 
event={"ID":"79ccdb04-ca5c-4266-8367-b0a04310f061","Type":"ContainerDied","Data":"24a927e1eb93abbbf3d89f6e61c8d60c35cb8444b79fef81158ad66d2e80d4ab"} Oct 03 09:01:32 crc kubenswrapper[4765]: I1003 09:01:32.192018 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-zcrmj" event={"ID":"79ccdb04-ca5c-4266-8367-b0a04310f061","Type":"ContainerStarted","Data":"d4ac88c2da68c6610bf0c826cd4115345e11cdeaa80e6dbcb7ddffd689fdad8c"} Oct 03 09:01:33 crc kubenswrapper[4765]: I1003 09:01:33.566407 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-zcrmj" Oct 03 09:01:33 crc kubenswrapper[4765]: I1003 09:01:33.599297 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpzx2\" (UniqueName: \"kubernetes.io/projected/79ccdb04-ca5c-4266-8367-b0a04310f061-kube-api-access-qpzx2\") pod \"79ccdb04-ca5c-4266-8367-b0a04310f061\" (UID: \"79ccdb04-ca5c-4266-8367-b0a04310f061\") " Oct 03 09:01:33 crc kubenswrapper[4765]: I1003 09:01:33.604828 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ccdb04-ca5c-4266-8367-b0a04310f061-kube-api-access-qpzx2" (OuterVolumeSpecName: "kube-api-access-qpzx2") pod "79ccdb04-ca5c-4266-8367-b0a04310f061" (UID: "79ccdb04-ca5c-4266-8367-b0a04310f061"). InnerVolumeSpecName "kube-api-access-qpzx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:33 crc kubenswrapper[4765]: I1003 09:01:33.700575 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpzx2\" (UniqueName: \"kubernetes.io/projected/79ccdb04-ca5c-4266-8367-b0a04310f061-kube-api-access-qpzx2\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:34 crc kubenswrapper[4765]: I1003 09:01:34.208076 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-zcrmj" event={"ID":"79ccdb04-ca5c-4266-8367-b0a04310f061","Type":"ContainerDied","Data":"d4ac88c2da68c6610bf0c826cd4115345e11cdeaa80e6dbcb7ddffd689fdad8c"} Oct 03 09:01:34 crc kubenswrapper[4765]: I1003 09:01:34.208122 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4ac88c2da68c6610bf0c826cd4115345e11cdeaa80e6dbcb7ddffd689fdad8c" Oct 03 09:01:34 crc kubenswrapper[4765]: I1003 09:01:34.208146 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-zcrmj" Oct 03 09:01:40 crc kubenswrapper[4765]: I1003 09:01:40.368129 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-8945-account-create-sd2pq"] Oct 03 09:01:40 crc kubenswrapper[4765]: E1003 09:01:40.369557 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ccdb04-ca5c-4266-8367-b0a04310f061" containerName="mariadb-database-create" Oct 03 09:01:40 crc kubenswrapper[4765]: I1003 09:01:40.369578 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ccdb04-ca5c-4266-8367-b0a04310f061" containerName="mariadb-database-create" Oct 03 09:01:40 crc kubenswrapper[4765]: I1003 09:01:40.369784 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ccdb04-ca5c-4266-8367-b0a04310f061" containerName="mariadb-database-create" Oct 03 09:01:40 crc kubenswrapper[4765]: I1003 09:01:40.370528 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-8945-account-create-sd2pq" Oct 03 09:01:40 crc kubenswrapper[4765]: I1003 09:01:40.372576 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Oct 03 09:01:40 crc kubenswrapper[4765]: I1003 09:01:40.394224 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-8945-account-create-sd2pq"] Oct 03 09:01:40 crc kubenswrapper[4765]: I1003 09:01:40.407371 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sszx5\" (UniqueName: \"kubernetes.io/projected/fbe411d1-aa31-4448-a2b6-176de25d7ff0-kube-api-access-sszx5\") pod \"watcher-8945-account-create-sd2pq\" (UID: \"fbe411d1-aa31-4448-a2b6-176de25d7ff0\") " pod="watcher-kuttl-default/watcher-8945-account-create-sd2pq" Oct 03 09:01:40 crc kubenswrapper[4765]: I1003 09:01:40.509088 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sszx5\" (UniqueName: \"kubernetes.io/projected/fbe411d1-aa31-4448-a2b6-176de25d7ff0-kube-api-access-sszx5\") pod \"watcher-8945-account-create-sd2pq\" (UID: \"fbe411d1-aa31-4448-a2b6-176de25d7ff0\") " pod="watcher-kuttl-default/watcher-8945-account-create-sd2pq" Oct 03 09:01:40 crc kubenswrapper[4765]: I1003 09:01:40.536871 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sszx5\" (UniqueName: \"kubernetes.io/projected/fbe411d1-aa31-4448-a2b6-176de25d7ff0-kube-api-access-sszx5\") pod \"watcher-8945-account-create-sd2pq\" (UID: \"fbe411d1-aa31-4448-a2b6-176de25d7ff0\") " pod="watcher-kuttl-default/watcher-8945-account-create-sd2pq" Oct 03 09:01:40 crc kubenswrapper[4765]: I1003 09:01:40.692627 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-8945-account-create-sd2pq" Oct 03 09:01:41 crc kubenswrapper[4765]: I1003 09:01:41.290387 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-8945-account-create-sd2pq"] Oct 03 09:01:42 crc kubenswrapper[4765]: I1003 09:01:42.271633 4765 generic.go:334] "Generic (PLEG): container finished" podID="fbe411d1-aa31-4448-a2b6-176de25d7ff0" containerID="420bd236d0b7be62bc45f519bc95d0a361de7a867c4336e89abf73594df0cc38" exitCode=0 Oct 03 09:01:42 crc kubenswrapper[4765]: I1003 09:01:42.271754 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-8945-account-create-sd2pq" event={"ID":"fbe411d1-aa31-4448-a2b6-176de25d7ff0","Type":"ContainerDied","Data":"420bd236d0b7be62bc45f519bc95d0a361de7a867c4336e89abf73594df0cc38"} Oct 03 09:01:42 crc kubenswrapper[4765]: I1003 09:01:42.272028 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-8945-account-create-sd2pq" event={"ID":"fbe411d1-aa31-4448-a2b6-176de25d7ff0","Type":"ContainerStarted","Data":"9243d9d85a2da5c136bc1c091afac9aad2de7e842cafa8d89d35ea70462fd858"} Oct 03 09:01:43 crc kubenswrapper[4765]: I1003 09:01:43.838046 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-8945-account-create-sd2pq" Oct 03 09:01:43 crc kubenswrapper[4765]: I1003 09:01:43.860996 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sszx5\" (UniqueName: \"kubernetes.io/projected/fbe411d1-aa31-4448-a2b6-176de25d7ff0-kube-api-access-sszx5\") pod \"fbe411d1-aa31-4448-a2b6-176de25d7ff0\" (UID: \"fbe411d1-aa31-4448-a2b6-176de25d7ff0\") " Oct 03 09:01:43 crc kubenswrapper[4765]: I1003 09:01:43.866997 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe411d1-aa31-4448-a2b6-176de25d7ff0-kube-api-access-sszx5" (OuterVolumeSpecName: "kube-api-access-sszx5") pod "fbe411d1-aa31-4448-a2b6-176de25d7ff0" (UID: "fbe411d1-aa31-4448-a2b6-176de25d7ff0"). InnerVolumeSpecName "kube-api-access-sszx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:43 crc kubenswrapper[4765]: I1003 09:01:43.962638 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sszx5\" (UniqueName: \"kubernetes.io/projected/fbe411d1-aa31-4448-a2b6-176de25d7ff0-kube-api-access-sszx5\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:44 crc kubenswrapper[4765]: I1003 09:01:44.289287 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-8945-account-create-sd2pq" event={"ID":"fbe411d1-aa31-4448-a2b6-176de25d7ff0","Type":"ContainerDied","Data":"9243d9d85a2da5c136bc1c091afac9aad2de7e842cafa8d89d35ea70462fd858"} Oct 03 09:01:44 crc kubenswrapper[4765]: I1003 09:01:44.289628 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9243d9d85a2da5c136bc1c091afac9aad2de7e842cafa8d89d35ea70462fd858" Oct 03 09:01:44 crc kubenswrapper[4765]: I1003 09:01:44.289396 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-8945-account-create-sd2pq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.699188 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-crsnq"] Oct 03 09:01:45 crc kubenswrapper[4765]: E1003 09:01:45.699489 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe411d1-aa31-4448-a2b6-176de25d7ff0" containerName="mariadb-account-create" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.699499 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe411d1-aa31-4448-a2b6-176de25d7ff0" containerName="mariadb-account-create" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.699663 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe411d1-aa31-4448-a2b6-176de25d7ff0" containerName="mariadb-account-create" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.700185 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.705093 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.709201 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-crsnq"] Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.711603 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-cm42b" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.788890 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.788965 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-config-data\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.789069 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-db-sync-config-data\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.789191 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kj9q\" (UniqueName: \"kubernetes.io/projected/796f3197-44d8-408b-a8d4-2b5c6a282ff5-kube-api-access-2kj9q\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.890628 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kj9q\" (UniqueName: \"kubernetes.io/projected/796f3197-44d8-408b-a8d4-2b5c6a282ff5-kube-api-access-2kj9q\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.890691 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.890711 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-config-data\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc 
kubenswrapper[4765]: I1003 09:01:45.890772 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-db-sync-config-data\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.896106 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-config-data\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.904146 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.905071 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-db-sync-config-data\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:45 crc kubenswrapper[4765]: I1003 09:01:45.908407 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kj9q\" (UniqueName: \"kubernetes.io/projected/796f3197-44d8-408b-a8d4-2b5c6a282ff5-kube-api-access-2kj9q\") pod \"watcher-kuttl-db-sync-crsnq\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:46 crc kubenswrapper[4765]: I1003 09:01:46.018897 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:46 crc kubenswrapper[4765]: I1003 09:01:46.492570 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-crsnq"] Oct 03 09:01:47 crc kubenswrapper[4765]: I1003 09:01:47.388318 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" event={"ID":"796f3197-44d8-408b-a8d4-2b5c6a282ff5","Type":"ContainerStarted","Data":"3e876d0881959f828070159610cf715b0328f5f69152bcbf95e36604b60306d6"} Oct 03 09:01:47 crc kubenswrapper[4765]: I1003 09:01:47.388604 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" event={"ID":"796f3197-44d8-408b-a8d4-2b5c6a282ff5","Type":"ContainerStarted","Data":"dc87328d9ff05053d6edd405d624f262af94552b492b268f26021277438d8cfc"} Oct 03 09:01:47 crc kubenswrapper[4765]: I1003 09:01:47.433514 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" podStartSLOduration=2.433487778 podStartE2EDuration="2.433487778s" podCreationTimestamp="2025-10-03 09:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:47.430048342 +0000 UTC m=+1351.731542692" watchObservedRunningTime="2025-10-03 09:01:47.433487778 +0000 UTC m=+1351.734982108" Oct 03 09:01:50 crc kubenswrapper[4765]: I1003 09:01:50.412382 4765 generic.go:334] "Generic (PLEG): container finished" podID="796f3197-44d8-408b-a8d4-2b5c6a282ff5" containerID="3e876d0881959f828070159610cf715b0328f5f69152bcbf95e36604b60306d6" exitCode=0 Oct 03 09:01:50 crc kubenswrapper[4765]: I1003 09:01:50.412434 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" event={"ID":"796f3197-44d8-408b-a8d4-2b5c6a282ff5","Type":"ContainerDied","Data":"3e876d0881959f828070159610cf715b0328f5f69152bcbf95e36604b60306d6"} Oct 03 09:01:51 crc kubenswrapper[4765]: I1003 09:01:51.906172 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.086243 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kj9q\" (UniqueName: \"kubernetes.io/projected/796f3197-44d8-408b-a8d4-2b5c6a282ff5-kube-api-access-2kj9q\") pod \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.086302 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-combined-ca-bundle\") pod \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.086344 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-config-data\") pod \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.086402 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-db-sync-config-data\") pod \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\" (UID: \"796f3197-44d8-408b-a8d4-2b5c6a282ff5\") " Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.091297 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "796f3197-44d8-408b-a8d4-2b5c6a282ff5" (UID: "796f3197-44d8-408b-a8d4-2b5c6a282ff5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.099966 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/796f3197-44d8-408b-a8d4-2b5c6a282ff5-kube-api-access-2kj9q" (OuterVolumeSpecName: "kube-api-access-2kj9q") pod "796f3197-44d8-408b-a8d4-2b5c6a282ff5" (UID: "796f3197-44d8-408b-a8d4-2b5c6a282ff5"). InnerVolumeSpecName "kube-api-access-2kj9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.114495 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "796f3197-44d8-408b-a8d4-2b5c6a282ff5" (UID: "796f3197-44d8-408b-a8d4-2b5c6a282ff5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.129787 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-config-data" (OuterVolumeSpecName: "config-data") pod "796f3197-44d8-408b-a8d4-2b5c6a282ff5" (UID: "796f3197-44d8-408b-a8d4-2b5c6a282ff5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.188365 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kj9q\" (UniqueName: \"kubernetes.io/projected/796f3197-44d8-408b-a8d4-2b5c6a282ff5-kube-api-access-2kj9q\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.188401 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.188414 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.188422 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/796f3197-44d8-408b-a8d4-2b5c6a282ff5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.446380 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" event={"ID":"796f3197-44d8-408b-a8d4-2b5c6a282ff5","Type":"ContainerDied","Data":"dc87328d9ff05053d6edd405d624f262af94552b492b268f26021277438d8cfc"} Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.446429 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc87328d9ff05053d6edd405d624f262af94552b492b268f26021277438d8cfc" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.446467 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-crsnq" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.768154 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:01:52 crc kubenswrapper[4765]: E1003 09:01:52.768885 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="796f3197-44d8-408b-a8d4-2b5c6a282ff5" containerName="watcher-kuttl-db-sync" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.768992 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="796f3197-44d8-408b-a8d4-2b5c6a282ff5" containerName="watcher-kuttl-db-sync" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.769261 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="796f3197-44d8-408b-a8d4-2b5c6a282ff5" containerName="watcher-kuttl-db-sync" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.769871 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.772579 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.772977 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-cm42b" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.781136 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.848270 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.849255 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.855407 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.862152 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.899973 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.900521 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xcb4\" (UniqueName: \"kubernetes.io/projected/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-kube-api-access-5xcb4\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.900597 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.900680 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.900722 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 
09:01:52.936829 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.938227 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.942957 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.943692 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.943885 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Oct 03 09:01:52 crc kubenswrapper[4765]: I1003 09:01:52.968985 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.003691 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcj52\" (UniqueName: \"kubernetes.io/projected/814e2e41-1895-4103-b651-9e8e9db42905-kube-api-access-vcj52\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.004102 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.004178 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814e2e41-1895-4103-b651-9e8e9db42905-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.004303 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xcb4\" (UniqueName: \"kubernetes.io/projected/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-kube-api-access-5xcb4\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.004322 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.004365 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.004395 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.004417 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.004441 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.007967 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.008871 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.009949 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.010333 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.028762 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xcb4\" (UniqueName: \"kubernetes.io/projected/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-kube-api-access-5xcb4\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.089443 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.105821 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814e2e41-1895-4103-b651-9e8e9db42905-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.106061 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.106185 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.106274 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbdj6\" (UniqueName: \"kubernetes.io/projected/3343efeb-5001-4879-9e39-f534fe992296-kube-api-access-jbdj6\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.106388 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.106458 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.106479 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814e2e41-1895-4103-b651-9e8e9db42905-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.106556 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.107294 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3343efeb-5001-4879-9e39-f534fe992296-logs\") pod \"watcher-kuttl-api-0\" (UID: 
\"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.107469 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.107566 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcj52\" (UniqueName: \"kubernetes.io/projected/814e2e41-1895-4103-b651-9e8e9db42905-kube-api-access-vcj52\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.107601 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.110005 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.110570 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.128743 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcj52\" (UniqueName: \"kubernetes.io/projected/814e2e41-1895-4103-b651-9e8e9db42905-kube-api-access-vcj52\") pod \"watcher-kuttl-applier-0\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.164507 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.208343 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.208408 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.208449 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbdj6\" (UniqueName: \"kubernetes.io/projected/3343efeb-5001-4879-9e39-f534fe992296-kube-api-access-jbdj6\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.208510 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.208554 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3343efeb-5001-4879-9e39-f534fe992296-logs\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.208592 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.208630 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.209071 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3343efeb-5001-4879-9e39-f534fe992296-logs\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.219035 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 
09:01:53.227127 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.238119 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.238225 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.242852 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.268557 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbdj6\" (UniqueName: \"kubernetes.io/projected/3343efeb-5001-4879-9e39-f534fe992296-kube-api-access-jbdj6\") pod \"watcher-kuttl-api-0\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.552968 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.620756 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:01:53 crc kubenswrapper[4765]: I1003 09:01:53.769658 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:01:53 crc kubenswrapper[4765]: W1003 09:01:53.781790 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod814e2e41_1895_4103_b651_9e8e9db42905.slice/crio-a1dee04a50a3af16bcd9e85c981ed3d703ab16b332debf69391f2bcb33f919a5 WatchSource:0}: Error finding container a1dee04a50a3af16bcd9e85c981ed3d703ab16b332debf69391f2bcb33f919a5: Status 404 returned error can't find the container with id a1dee04a50a3af16bcd9e85c981ed3d703ab16b332debf69391f2bcb33f919a5 Oct 03 09:01:54 crc kubenswrapper[4765]: I1003 09:01:54.062275 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:01:54 crc kubenswrapper[4765]: W1003 09:01:54.070347 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3343efeb_5001_4879_9e39_f534fe992296.slice/crio-6948c3eaa009ba5dfe61a06b1b76193cb3a2debbf644ae3956a58c3adb349592 WatchSource:0}: Error finding container 6948c3eaa009ba5dfe61a06b1b76193cb3a2debbf644ae3956a58c3adb349592: Status 404 returned error can't find the container with id 6948c3eaa009ba5dfe61a06b1b76193cb3a2debbf644ae3956a58c3adb349592 Oct 03 09:01:54 crc kubenswrapper[4765]: I1003 09:01:54.468031 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"814e2e41-1895-4103-b651-9e8e9db42905","Type":"ContainerStarted","Data":"6a8e4114cc89500a9ec7aab695e077b388beb5d9aa62d833e8ca372e856f6652"} Oct 03 09:01:54 crc kubenswrapper[4765]: I1003 09:01:54.468778 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"814e2e41-1895-4103-b651-9e8e9db42905","Type":"ContainerStarted","Data":"a1dee04a50a3af16bcd9e85c981ed3d703ab16b332debf69391f2bcb33f919a5"} Oct 03 09:01:54 crc kubenswrapper[4765]: I1003 09:01:54.480469 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f497c8f6-6ad1-4287-a194-e7cd523e8eb7","Type":"ContainerStarted","Data":"3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748"} Oct 03 09:01:54 crc kubenswrapper[4765]: I1003 09:01:54.480679 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f497c8f6-6ad1-4287-a194-e7cd523e8eb7","Type":"ContainerStarted","Data":"e4daa85676010b8ae43865e480f00d39a905b131b580e600031264bf5ebb0dca"} Oct 03 09:01:54 crc kubenswrapper[4765]: I1003 09:01:54.482513 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"3343efeb-5001-4879-9e39-f534fe992296","Type":"ContainerStarted","Data":"a256f7182c9954a2fb4a26aec81aea2f48da1e9d97f6c25eac8ed563735336a1"} Oct 03 09:01:54 crc kubenswrapper[4765]: I1003 09:01:54.482554 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"3343efeb-5001-4879-9e39-f534fe992296","Type":"ContainerStarted","Data":"6948c3eaa009ba5dfe61a06b1b76193cb3a2debbf644ae3956a58c3adb349592"} Oct 03 09:01:54 crc kubenswrapper[4765]: I1003 09:01:54.497076 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.497058441 podStartE2EDuration="2.497058441s" podCreationTimestamp="2025-10-03 09:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:54.493303877 +0000 UTC m=+1358.794798217" watchObservedRunningTime="2025-10-03 09:01:54.497058441 +0000 UTC m=+1358.798552761" Oct 03 09:01:54 crc kubenswrapper[4765]: I1003 09:01:54.527526 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.527505488 podStartE2EDuration="2.527505488s" podCreationTimestamp="2025-10-03 09:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:54.51725865 +0000 UTC m=+1358.818752980" watchObservedRunningTime="2025-10-03 09:01:54.527505488 +0000 UTC m=+1358.828999818" Oct 03 09:01:55 crc kubenswrapper[4765]: I1003 09:01:55.493713 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"3343efeb-5001-4879-9e39-f534fe992296","Type":"ContainerStarted","Data":"4f7de61b330256f08a4627ab69ec4c4cb6a28eb0fcfd7fdef90a82185028672c"} Oct 03 09:01:55 crc kubenswrapper[4765]: I1003 09:01:55.494353 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:55 crc kubenswrapper[4765]: I1003 09:01:55.523906 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=3.523889434 podStartE2EDuration="3.523889434s" podCreationTimestamp="2025-10-03 09:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:55.519417431 +0000 UTC m=+1359.820911761" watchObservedRunningTime="2025-10-03 09:01:55.523889434 +0000 UTC m=+1359.825383764" Oct 03 09:01:56 crc kubenswrapper[4765]: I1003 09:01:56.419339 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:01:57 crc kubenswrapper[4765]: I1003 09:01:57.969769 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:01:58 crc kubenswrapper[4765]: I1003 09:01:58.164738 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:01:58 crc kubenswrapper[4765]: I1003 09:01:58.553432 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:03 crc kubenswrapper[4765]: I1003 09:02:03.090443 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:03 crc kubenswrapper[4765]: I1003 09:02:03.132566 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:03 crc kubenswrapper[4765]: I1003 
09:02:03.164975 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:03 crc kubenswrapper[4765]: I1003 09:02:03.187085 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:03 crc kubenswrapper[4765]: I1003 09:02:03.553997 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:03 crc kubenswrapper[4765]: I1003 09:02:03.554027 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:03 crc kubenswrapper[4765]: I1003 09:02:03.566751 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:03 crc kubenswrapper[4765]: I1003 09:02:03.582276 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:03 crc kubenswrapper[4765]: I1003 09:02:03.588596 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:04 crc kubenswrapper[4765]: I1003 09:02:04.566511 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:06 crc kubenswrapper[4765]: I1003 09:02:06.743603 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:06 crc kubenswrapper[4765]: I1003 09:02:06.745316 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="ceilometer-central-agent" containerID="cri-o://539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759" gracePeriod=30 Oct 03 09:02:06 crc kubenswrapper[4765]: I1003 09:02:06.745456 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="ceilometer-notification-agent" containerID="cri-o://9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9" gracePeriod=30 Oct 03 09:02:06 crc kubenswrapper[4765]: I1003 09:02:06.745382 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="proxy-httpd" containerID="cri-o://d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6" gracePeriod=30 Oct 03 09:02:06 crc kubenswrapper[4765]: I1003 09:02:06.745438 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="sg-core" containerID="cri-o://93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3" gracePeriod=30 Oct 03 09:02:06 crc kubenswrapper[4765]: E1003 09:02:06.951683 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82df5442_55c2_414c_b79e_1b2cb7ea3c98.slice/crio-conmon-93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82df5442_55c2_414c_b79e_1b2cb7ea3c98.slice/crio-d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6.scope\": RecentStats: unable to find data in memory cache]" Oct 03 09:02:07 crc kubenswrapper[4765]: I1003 09:02:07.592135 4765 generic.go:334] "Generic (PLEG): container finished" podID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerID="d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6" exitCode=0 Oct 03 09:02:07 crc kubenswrapper[4765]: I1003 09:02:07.592495 4765 generic.go:334] "Generic (PLEG): container finished" podID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerID="93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3" exitCode=2 Oct 03 09:02:07 crc kubenswrapper[4765]: I1003 09:02:07.592505 4765 generic.go:334] "Generic (PLEG): container finished" podID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerID="539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759" exitCode=0 Oct 03 09:02:07 crc kubenswrapper[4765]: I1003 09:02:07.592219 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"82df5442-55c2-414c-b79e-1b2cb7ea3c98","Type":"ContainerDied","Data":"d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6"} Oct 03 09:02:07 crc kubenswrapper[4765]: I1003 09:02:07.592532 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"82df5442-55c2-414c-b79e-1b2cb7ea3c98","Type":"ContainerDied","Data":"93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3"} Oct 03 09:02:07 crc kubenswrapper[4765]: I1003 09:02:07.592542 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"82df5442-55c2-414c-b79e-1b2cb7ea3c98","Type":"ContainerDied","Data":"539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759"} Oct 03 09:02:07 crc kubenswrapper[4765]: I1003 09:02:07.847955 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:07 crc kubenswrapper[4765]: I1003 09:02:07.848240 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="3343efeb-5001-4879-9e39-f534fe992296" containerName="watcher-kuttl-api-log" containerID="cri-o://a256f7182c9954a2fb4a26aec81aea2f48da1e9d97f6c25eac8ed563735336a1" gracePeriod=30 Oct 03 09:02:07 crc kubenswrapper[4765]: I1003 09:02:07.848306 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="3343efeb-5001-4879-9e39-f534fe992296" containerName="watcher-api" containerID="cri-o://4f7de61b330256f08a4627ab69ec4c4cb6a28eb0fcfd7fdef90a82185028672c" gracePeriod=30 Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.615461 4765 generic.go:334] "Generic (PLEG): container finished" podID="3343efeb-5001-4879-9e39-f534fe992296" containerID="4f7de61b330256f08a4627ab69ec4c4cb6a28eb0fcfd7fdef90a82185028672c" exitCode=0 Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.615638 4765 generic.go:334] "Generic (PLEG): container finished" podID="3343efeb-5001-4879-9e39-f534fe992296" containerID="a256f7182c9954a2fb4a26aec81aea2f48da1e9d97f6c25eac8ed563735336a1" exitCode=143 Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.615674 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"3343efeb-5001-4879-9e39-f534fe992296","Type":"ContainerDied","Data":"4f7de61b330256f08a4627ab69ec4c4cb6a28eb0fcfd7fdef90a82185028672c"} Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.615714 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"3343efeb-5001-4879-9e39-f534fe992296","Type":"ContainerDied","Data":"a256f7182c9954a2fb4a26aec81aea2f48da1e9d97f6c25eac8ed563735336a1"} Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.787160 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.877317 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-config-data\") pod \"3343efeb-5001-4879-9e39-f534fe992296\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.877433 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-internal-tls-certs\") pod \"3343efeb-5001-4879-9e39-f534fe992296\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.877491 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbdj6\" (UniqueName: \"kubernetes.io/projected/3343efeb-5001-4879-9e39-f534fe992296-kube-api-access-jbdj6\") pod \"3343efeb-5001-4879-9e39-f534fe992296\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.877526 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-public-tls-certs\") pod \"3343efeb-5001-4879-9e39-f534fe992296\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.877573 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3343efeb-5001-4879-9e39-f534fe992296-logs\") pod \"3343efeb-5001-4879-9e39-f534fe992296\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.877698 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-combined-ca-bundle\") pod \"3343efeb-5001-4879-9e39-f534fe992296\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.877757 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-custom-prometheus-ca\") pod \"3343efeb-5001-4879-9e39-f534fe992296\" (UID: \"3343efeb-5001-4879-9e39-f534fe992296\") " Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.878198 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3343efeb-5001-4879-9e39-f534fe992296-logs" (OuterVolumeSpecName: "logs") pod "3343efeb-5001-4879-9e39-f534fe992296" (UID: "3343efeb-5001-4879-9e39-f534fe992296"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.886399 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3343efeb-5001-4879-9e39-f534fe992296-kube-api-access-jbdj6" (OuterVolumeSpecName: "kube-api-access-jbdj6") pod "3343efeb-5001-4879-9e39-f534fe992296" (UID: "3343efeb-5001-4879-9e39-f534fe992296"). InnerVolumeSpecName "kube-api-access-jbdj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.910768 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3343efeb-5001-4879-9e39-f534fe992296" (UID: "3343efeb-5001-4879-9e39-f534fe992296"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.917097 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3343efeb-5001-4879-9e39-f534fe992296" (UID: "3343efeb-5001-4879-9e39-f534fe992296"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.934893 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3343efeb-5001-4879-9e39-f534fe992296" (UID: "3343efeb-5001-4879-9e39-f534fe992296"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.943920 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3343efeb-5001-4879-9e39-f534fe992296" (UID: "3343efeb-5001-4879-9e39-f534fe992296"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.967406 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-config-data" (OuterVolumeSpecName: "config-data") pod "3343efeb-5001-4879-9e39-f534fe992296" (UID: "3343efeb-5001-4879-9e39-f534fe992296"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.979736 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.979777 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.979789 4765 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.979801 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbdj6\" (UniqueName: \"kubernetes.io/projected/3343efeb-5001-4879-9e39-f534fe992296-kube-api-access-jbdj6\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.979813 4765 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.979823 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3343efeb-5001-4879-9e39-f534fe992296-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:08 crc kubenswrapper[4765]: I1003 09:02:08.979833 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3343efeb-5001-4879-9e39-f534fe992296-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.631357 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"3343efeb-5001-4879-9e39-f534fe992296","Type":"ContainerDied","Data":"6948c3eaa009ba5dfe61a06b1b76193cb3a2debbf644ae3956a58c3adb349592"} Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.631403 4765 scope.go:117] "RemoveContainer" containerID="4f7de61b330256f08a4627ab69ec4c4cb6a28eb0fcfd7fdef90a82185028672c" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.631534 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.672479 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.683368 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.696495 4765 scope.go:117] "RemoveContainer" containerID="a256f7182c9954a2fb4a26aec81aea2f48da1e9d97f6c25eac8ed563735336a1" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.696752 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:09 crc kubenswrapper[4765]: E1003 09:02:09.697229 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3343efeb-5001-4879-9e39-f534fe992296" containerName="watcher-kuttl-api-log" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.697257 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3343efeb-5001-4879-9e39-f534fe992296" containerName="watcher-kuttl-api-log" Oct 03 09:02:09 crc kubenswrapper[4765]: E1003 09:02:09.697280 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3343efeb-5001-4879-9e39-f534fe992296" containerName="watcher-api" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.697289 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3343efeb-5001-4879-9e39-f534fe992296" containerName="watcher-api" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.697568 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3343efeb-5001-4879-9e39-f534fe992296" containerName="watcher-api" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.697602 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3343efeb-5001-4879-9e39-f534fe992296" containerName="watcher-kuttl-api-log" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.698787 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.702879 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.703276 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.703407 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.712062 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.792480 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.792564 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.792616 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.792681 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsxqg\" (UniqueName: \"kubernetes.io/projected/4825e1eb-acf6-4b63-a45e-d667bd450345-kube-api-access-tsxqg\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.792784 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.792823 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.792883 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4825e1eb-acf6-4b63-a45e-d667bd450345-logs\") pod \"watcher-kuttl-api-0\" (UID: 
\"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.894587 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.894630 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.894674 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4825e1eb-acf6-4b63-a45e-d667bd450345-logs\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.894711 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.894756 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.894803 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.894830 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsxqg\" (UniqueName: \"kubernetes.io/projected/4825e1eb-acf6-4b63-a45e-d667bd450345-kube-api-access-tsxqg\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.895203 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4825e1eb-acf6-4b63-a45e-d667bd450345-logs\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.902405 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: 
I1003 09:02:09.902467 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.902705 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.903494 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.904454 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:09 crc kubenswrapper[4765]: I1003 09:02:09.917206 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsxqg\" (UniqueName: \"kubernetes.io/projected/4825e1eb-acf6-4b63-a45e-d667bd450345-kube-api-access-tsxqg\") pod \"watcher-kuttl-api-0\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:10 crc kubenswrapper[4765]: I1003 09:02:10.016934 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:10 crc kubenswrapper[4765]: I1003 09:02:10.319370 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3343efeb-5001-4879-9e39-f534fe992296" path="/var/lib/kubelet/pods/3343efeb-5001-4879-9e39-f534fe992296/volumes" Oct 03 09:02:10 crc kubenswrapper[4765]: I1003 09:02:10.477515 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:10 crc kubenswrapper[4765]: I1003 09:02:10.641538 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4825e1eb-acf6-4b63-a45e-d667bd450345","Type":"ContainerStarted","Data":"8c7659e78092365bf578a41d2b2db0129d6bc8f41b75d3281afe9e65815178fa"} Oct 03 09:02:11 crc kubenswrapper[4765]: I1003 09:02:11.651116 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4825e1eb-acf6-4b63-a45e-d667bd450345","Type":"ContainerStarted","Data":"dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b"} Oct 03 09:02:11 crc kubenswrapper[4765]: I1003 09:02:11.651461 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:11 crc kubenswrapper[4765]: I1003 09:02:11.651472 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4825e1eb-acf6-4b63-a45e-d667bd450345","Type":"ContainerStarted","Data":"c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe"} Oct 03 09:02:11 crc kubenswrapper[4765]: I1003 09:02:11.672437 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.67241772 podStartE2EDuration="2.67241772s" podCreationTimestamp="2025-10-03 09:02:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:11.668550583 +0000 UTC m=+1375.970044913" watchObservedRunningTime="2025-10-03 09:02:11.67241772 +0000 UTC m=+1375.973912050" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.045547 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.354811 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.537540 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-ceilometer-tls-certs\") pod \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.537631 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-log-httpd\") pod \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.537732 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-scripts\") pod \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.537793 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-combined-ca-bundle\") pod \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.537839 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-sg-core-conf-yaml\") pod \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.537861 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-run-httpd\") pod \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.537888 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmdfj\" (UniqueName: \"kubernetes.io/projected/82df5442-55c2-414c-b79e-1b2cb7ea3c98-kube-api-access-rmdfj\") pod \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.537957 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-config-data\") pod \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\" (UID: \"82df5442-55c2-414c-b79e-1b2cb7ea3c98\") " Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.538028 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "82df5442-55c2-414c-b79e-1b2cb7ea3c98" (UID: "82df5442-55c2-414c-b79e-1b2cb7ea3c98"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.538197 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "82df5442-55c2-414c-b79e-1b2cb7ea3c98" (UID: "82df5442-55c2-414c-b79e-1b2cb7ea3c98"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.538461 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.538479 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82df5442-55c2-414c-b79e-1b2cb7ea3c98-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.542204 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-scripts" (OuterVolumeSpecName: "scripts") pod "82df5442-55c2-414c-b79e-1b2cb7ea3c98" (UID: "82df5442-55c2-414c-b79e-1b2cb7ea3c98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.542572 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82df5442-55c2-414c-b79e-1b2cb7ea3c98-kube-api-access-rmdfj" (OuterVolumeSpecName: "kube-api-access-rmdfj") pod "82df5442-55c2-414c-b79e-1b2cb7ea3c98" (UID: "82df5442-55c2-414c-b79e-1b2cb7ea3c98"). InnerVolumeSpecName "kube-api-access-rmdfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.583917 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "82df5442-55c2-414c-b79e-1b2cb7ea3c98" (UID: "82df5442-55c2-414c-b79e-1b2cb7ea3c98"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.588790 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "82df5442-55c2-414c-b79e-1b2cb7ea3c98" (UID: "82df5442-55c2-414c-b79e-1b2cb7ea3c98"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.639567 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.639614 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.639633 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmdfj\" (UniqueName: \"kubernetes.io/projected/82df5442-55c2-414c-b79e-1b2cb7ea3c98-kube-api-access-rmdfj\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.639660 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.646710 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82df5442-55c2-414c-b79e-1b2cb7ea3c98" (UID: "82df5442-55c2-414c-b79e-1b2cb7ea3c98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.648593 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-config-data" (OuterVolumeSpecName: "config-data") pod "82df5442-55c2-414c-b79e-1b2cb7ea3c98" (UID: "82df5442-55c2-414c-b79e-1b2cb7ea3c98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.661220 4765 generic.go:334] "Generic (PLEG): container finished" podID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerID="9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9" exitCode=0 Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.662019 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"82df5442-55c2-414c-b79e-1b2cb7ea3c98","Type":"ContainerDied","Data":"9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9"} Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.662133 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"82df5442-55c2-414c-b79e-1b2cb7ea3c98","Type":"ContainerDied","Data":"241c556bd94929d659b63c12be96486d595b488142f56b5dc8ed3c87fae1cecd"} Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.662056 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.662206 4765 scope.go:117] "RemoveContainer" containerID="d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.684864 4765 scope.go:117] "RemoveContainer" containerID="93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.740900 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.740943 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82df5442-55c2-414c-b79e-1b2cb7ea3c98-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.746496 4765 scope.go:117] "RemoveContainer" containerID="9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.749448 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.760684 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.769044 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:12 crc kubenswrapper[4765]: E1003 09:02:12.769359 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="proxy-httpd" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.769374 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="proxy-httpd" Oct 03 09:02:12 crc kubenswrapper[4765]: E1003 09:02:12.769401 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="ceilometer-notification-agent" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.769407 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="ceilometer-notification-agent" Oct 03 09:02:12 crc kubenswrapper[4765]: E1003 09:02:12.769418 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="sg-core" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.769424 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="sg-core" Oct 03 09:02:12 crc kubenswrapper[4765]: E1003 09:02:12.769443 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="ceilometer-central-agent" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.769448 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="ceilometer-central-agent" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.769600 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="ceilometer-central-agent" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.769614 4765 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="ceilometer-notification-agent" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.769625 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="sg-core" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.769639 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" containerName="proxy-httpd" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.771684 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.776737 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.777027 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.777976 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.784625 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.790981 4765 scope.go:117] "RemoveContainer" containerID="539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.856443 4765 scope.go:117] "RemoveContainer" containerID="d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6" Oct 03 09:02:12 crc kubenswrapper[4765]: E1003 09:02:12.857839 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6\": container with ID starting with d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6 not found: ID does not exist" containerID="d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.857876 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6"} err="failed to get container status \"d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6\": rpc error: code = NotFound desc = could not find container \"d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6\": container with ID starting with d15244f22349beeb877b9a00b7fae8926dcba142f46dac5470fa49e3a13903b6 not found: ID does not exist" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.857904 4765 scope.go:117] "RemoveContainer" containerID="93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3" Oct 03 09:02:12 crc kubenswrapper[4765]: E1003 09:02:12.859099 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3\": container with ID starting with 93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3 not found: ID does not exist" containerID="93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.859133 4765 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3"} err="failed to get container status \"93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3\": rpc error: code = NotFound desc = could not find container \"93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3\": container with ID starting with 93e61721165ef7c7f5e500de0476a37d1501e69249171c4eaf395fc24a6113d3 not found: ID does not exist" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.859152 4765 scope.go:117] "RemoveContainer" containerID="9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9" Oct 03 09:02:12 crc kubenswrapper[4765]: E1003 09:02:12.859461 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9\": container with ID starting with 9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9 not found: ID does not exist" containerID="9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.859495 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9"} err="failed to get container status \"9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9\": rpc error: code = NotFound desc = could not find container \"9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9\": container with ID starting with 9ec71c76904490e7ff12c75b98e9d1957f9c7ed58d059931122b1d39416fbde9 not found: ID does not exist" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.859512 4765 scope.go:117] "RemoveContainer" containerID="539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759" Oct 03 09:02:12 crc kubenswrapper[4765]: E1003 09:02:12.859965 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759\": container with ID starting with 539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759 not found: ID does not exist" containerID="539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.859991 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759"} err="failed to get container status \"539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759\": rpc error: code = NotFound desc = could not find container \"539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759\": container with ID starting with 539b4b5f78c6e31bc0bd30571fbf8b42fbc7dceb39fd92509c3b752732a4a759 not found: ID does not exist" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.944281 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.944352 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-config-data\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.944383 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f5gq\" (UniqueName: \"kubernetes.io/projected/dd692e27-9c41-4946-9776-955baa355470-kube-api-access-2f5gq\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.944418 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.944457 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-scripts\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.944474 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-run-httpd\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.944506 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:12 crc kubenswrapper[4765]: I1003 09:02:12.944525 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-log-httpd\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.045572 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.045659 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-config-data\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.045683 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f5gq\" (UniqueName: \"kubernetes.io/projected/dd692e27-9c41-4946-9776-955baa355470-kube-api-access-2f5gq\") pod \"ceilometer-0\" (UID: 
\"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.045712 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.045752 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-scripts\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.045782 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-run-httpd\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.045816 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.045837 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-log-httpd\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.046245 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-log-httpd\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.047425 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-run-httpd\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.050708 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-scripts\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.052389 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-config-data\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.052854 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.053085 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.062203 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.067274 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f5gq\" (UniqueName: \"kubernetes.io/projected/dd692e27-9c41-4946-9776-955baa355470-kube-api-access-2f5gq\") pod \"ceilometer-0\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.138388 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.555099 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="3343efeb-5001-4879-9e39-f534fe992296" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.153:9322/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.555228 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="3343efeb-5001-4879-9e39-f534fe992296" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"https://10.217.0.153:9322/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.669922 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.670060 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerName="watcher-kuttl-api-log" containerID="cri-o://dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b" gracePeriod=30 Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.670500 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerName="watcher-api" containerID="cri-o://c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe" gracePeriod=30 Oct 03 09:02:13 crc kubenswrapper[4765]: I1003 09:02:13.675479 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.154:9322/\": EOF" Oct 03 09:02:13 crc kubenswrapper[4765]: 
I1003 09:02:13.687990 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:13 crc kubenswrapper[4765]: W1003 09:02:13.702578 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd692e27_9c41_4946_9776_955baa355470.slice/crio-9d88333f9d3f16fb324d796d2ebf6d39c2880c44877399f59750b21e023f7b4d WatchSource:0}: Error finding container 9d88333f9d3f16fb324d796d2ebf6d39c2880c44877399f59750b21e023f7b4d: Status 404 returned error can't find the container with id 9d88333f9d3f16fb324d796d2ebf6d39c2880c44877399f59750b21e023f7b4d Oct 03 09:02:14 crc kubenswrapper[4765]: I1003 09:02:14.316903 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82df5442-55c2-414c-b79e-1b2cb7ea3c98" path="/var/lib/kubelet/pods/82df5442-55c2-414c-b79e-1b2cb7ea3c98/volumes" Oct 03 09:02:14 crc kubenswrapper[4765]: I1003 09:02:14.707047 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dd692e27-9c41-4946-9776-955baa355470","Type":"ContainerStarted","Data":"8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c"} Oct 03 09:02:14 crc kubenswrapper[4765]: I1003 09:02:14.707123 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dd692e27-9c41-4946-9776-955baa355470","Type":"ContainerStarted","Data":"9d88333f9d3f16fb324d796d2ebf6d39c2880c44877399f59750b21e023f7b4d"} Oct 03 09:02:14 crc kubenswrapper[4765]: I1003 09:02:14.710783 4765 generic.go:334] "Generic (PLEG): container finished" podID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerID="dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b" exitCode=143 Oct 03 09:02:14 crc kubenswrapper[4765]: I1003 09:02:14.711675 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4825e1eb-acf6-4b63-a45e-d667bd450345","Type":"ContainerDied","Data":"dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b"} Oct 03 09:02:15 crc kubenswrapper[4765]: I1003 09:02:15.017316 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:15 crc kubenswrapper[4765]: I1003 09:02:15.801257 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.154:9322/\": read tcp 10.217.0.2:48200->10.217.0.154:9322: read: connection reset by peer" Oct 03 09:02:15 crc kubenswrapper[4765]: I1003 09:02:15.802020 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.154:9322/\": dial tcp 10.217.0.154:9322: connect: connection refused" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.298348 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.498064 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsxqg\" (UniqueName: \"kubernetes.io/projected/4825e1eb-acf6-4b63-a45e-d667bd450345-kube-api-access-tsxqg\") pod \"4825e1eb-acf6-4b63-a45e-d667bd450345\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.498145 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-internal-tls-certs\") pod \"4825e1eb-acf6-4b63-a45e-d667bd450345\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.498219 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-public-tls-certs\") pod \"4825e1eb-acf6-4b63-a45e-d667bd450345\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.498270 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-combined-ca-bundle\") pod \"4825e1eb-acf6-4b63-a45e-d667bd450345\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.498368 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4825e1eb-acf6-4b63-a45e-d667bd450345-logs\") pod \"4825e1eb-acf6-4b63-a45e-d667bd450345\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.498448 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-config-data\") pod \"4825e1eb-acf6-4b63-a45e-d667bd450345\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.498507 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-custom-prometheus-ca\") pod \"4825e1eb-acf6-4b63-a45e-d667bd450345\" (UID: \"4825e1eb-acf6-4b63-a45e-d667bd450345\") " Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.498950 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4825e1eb-acf6-4b63-a45e-d667bd450345-logs" (OuterVolumeSpecName: "logs") pod "4825e1eb-acf6-4b63-a45e-d667bd450345" (UID: "4825e1eb-acf6-4b63-a45e-d667bd450345"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.503796 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4825e1eb-acf6-4b63-a45e-d667bd450345-kube-api-access-tsxqg" (OuterVolumeSpecName: "kube-api-access-tsxqg") pod "4825e1eb-acf6-4b63-a45e-d667bd450345" (UID: "4825e1eb-acf6-4b63-a45e-d667bd450345"). InnerVolumeSpecName "kube-api-access-tsxqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.525412 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4825e1eb-acf6-4b63-a45e-d667bd450345" (UID: "4825e1eb-acf6-4b63-a45e-d667bd450345"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.542407 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "4825e1eb-acf6-4b63-a45e-d667bd450345" (UID: "4825e1eb-acf6-4b63-a45e-d667bd450345"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.545492 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4825e1eb-acf6-4b63-a45e-d667bd450345" (UID: "4825e1eb-acf6-4b63-a45e-d667bd450345"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.546021 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-config-data" (OuterVolumeSpecName: "config-data") pod "4825e1eb-acf6-4b63-a45e-d667bd450345" (UID: "4825e1eb-acf6-4b63-a45e-d667bd450345"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.550391 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4825e1eb-acf6-4b63-a45e-d667bd450345" (UID: "4825e1eb-acf6-4b63-a45e-d667bd450345"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.600199 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.600229 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.600241 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsxqg\" (UniqueName: \"kubernetes.io/projected/4825e1eb-acf6-4b63-a45e-d667bd450345-kube-api-access-tsxqg\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.600250 4765 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.600261 4765 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.600271 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4825e1eb-acf6-4b63-a45e-d667bd450345-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.600282 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4825e1eb-acf6-4b63-a45e-d667bd450345-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.727181 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dd692e27-9c41-4946-9776-955baa355470","Type":"ContainerStarted","Data":"4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8"} Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.729126 4765 generic.go:334] "Generic (PLEG): container finished" podID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerID="c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe" exitCode=0 Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.729154 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4825e1eb-acf6-4b63-a45e-d667bd450345","Type":"ContainerDied","Data":"c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe"} Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.729170 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"4825e1eb-acf6-4b63-a45e-d667bd450345","Type":"ContainerDied","Data":"8c7659e78092365bf578a41d2b2db0129d6bc8f41b75d3281afe9e65815178fa"} Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.729186 4765 scope.go:117] "RemoveContainer" containerID="c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.729232 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.787431 4765 scope.go:117] "RemoveContainer" containerID="dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.792753 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.800018 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.812166 4765 scope.go:117] "RemoveContainer" containerID="c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe" Oct 03 09:02:16 crc kubenswrapper[4765]: E1003 09:02:16.812560 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe\": container with ID starting with c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe not found: ID does not exist" containerID="c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.812596 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe"} err="failed to get container status \"c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe\": rpc error: code = NotFound desc = could not find container \"c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe\": container with ID starting with c63fcfa0539545e558647a6bbe0a5034f8f09d394741418ffbfba8184c6dddbe not found: ID does not exist" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.812624 4765 scope.go:117] "RemoveContainer" containerID="dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b" Oct 03 09:02:16 crc kubenswrapper[4765]: E1003 09:02:16.813008 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b\": container with ID starting with dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b not found: ID does not exist" containerID="dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.813048 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b"} err="failed to get container status \"dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b\": rpc error: code = NotFound desc = could not find container \"dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b\": container with ID starting with dc66b34b55444c4e79e251ff624bb1774aaead653731b7e96d41821e33d4159b not found: ID does not exist" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.841537 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:16 crc kubenswrapper[4765]: E1003 09:02:16.841893 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerName="watcher-api" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.841910 4765 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerName="watcher-api" Oct 03 09:02:16 crc kubenswrapper[4765]: E1003 09:02:16.841927 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerName="watcher-kuttl-api-log" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.841934 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerName="watcher-kuttl-api-log" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.842094 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerName="watcher-kuttl-api-log" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.842112 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" containerName="watcher-api" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.843128 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.848034 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.848267 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.848407 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.904100 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.905454 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.905550 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.905667 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgb72\" (UniqueName: \"kubernetes.io/projected/6b2be982-d42e-40f6-840c-cbdb35fefc4b-kube-api-access-rgb72\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.905714 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.905808 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.905874 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:16 crc kubenswrapper[4765]: I1003 09:02:16.905932 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2be982-d42e-40f6-840c-cbdb35fefc4b-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.007039 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgb72\" (UniqueName: \"kubernetes.io/projected/6b2be982-d42e-40f6-840c-cbdb35fefc4b-kube-api-access-rgb72\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.007096 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.007151 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.007188 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.007221 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2be982-d42e-40f6-840c-cbdb35fefc4b-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.007264 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.007293 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.013662 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2be982-d42e-40f6-840c-cbdb35fefc4b-logs\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.015803 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.022284 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.024789 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.025114 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.052322 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.078267 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgb72\" (UniqueName: \"kubernetes.io/projected/6b2be982-d42e-40f6-840c-cbdb35fefc4b-kube-api-access-rgb72\") pod \"watcher-kuttl-api-0\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.280095 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.709065 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:17 crc kubenswrapper[4765]: W1003 09:02:17.710465 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b2be982_d42e_40f6_840c_cbdb35fefc4b.slice/crio-eed7451c613d2404816651890867b073eb3144c902db41fe12cfaefe4d935521 WatchSource:0}: Error finding container eed7451c613d2404816651890867b073eb3144c902db41fe12cfaefe4d935521: Status 404 returned error can't find the container with id eed7451c613d2404816651890867b073eb3144c902db41fe12cfaefe4d935521 Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.741439 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dd692e27-9c41-4946-9776-955baa355470","Type":"ContainerStarted","Data":"f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c"} Oct 03 09:02:17 crc kubenswrapper[4765]: I1003 09:02:17.742832 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6b2be982-d42e-40f6-840c-cbdb35fefc4b","Type":"ContainerStarted","Data":"eed7451c613d2404816651890867b073eb3144c902db41fe12cfaefe4d935521"} Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.323420 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4825e1eb-acf6-4b63-a45e-d667bd450345" path="/var/lib/kubelet/pods/4825e1eb-acf6-4b63-a45e-d667bd450345/volumes" Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.636891 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-crsnq"] Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.643005 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-crsnq"] Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.698297 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher8945-account-delete-pt9gj"] Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.699771 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher8945-account-delete-pt9gj" Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.708614 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.708856 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="f497c8f6-6ad1-4287-a194-e7cd523e8eb7" containerName="watcher-decision-engine" containerID="cri-o://3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748" gracePeriod=30 Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.734547 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzfhj\" (UniqueName: \"kubernetes.io/projected/dac4b22e-04b7-4780-acee-fd0971bd5e94-kube-api-access-mzfhj\") pod \"watcher8945-account-delete-pt9gj\" (UID: \"dac4b22e-04b7-4780-acee-fd0971bd5e94\") " pod="watcher-kuttl-default/watcher8945-account-delete-pt9gj" Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.752672 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6b2be982-d42e-40f6-840c-cbdb35fefc4b","Type":"ContainerStarted","Data":"8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a"} Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.753037 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6b2be982-d42e-40f6-840c-cbdb35fefc4b","Type":"ContainerStarted","Data":"d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843"} Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.753163 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.753288 4765 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="watcher-kuttl-default/watcher-kuttl-api-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-cm42b\" not found" Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.757518 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dd692e27-9c41-4946-9776-955baa355470","Type":"ContainerStarted","Data":"7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87"} Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.757795 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.766565 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher8945-account-delete-pt9gj"] Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.796170 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.796427 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="814e2e41-1895-4103-b651-9e8e9db42905" containerName="watcher-applier" containerID="cri-o://6a8e4114cc89500a9ec7aab695e077b388beb5d9aa62d833e8ca372e856f6652" gracePeriod=30 Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.814895 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.814869578 podStartE2EDuration="2.814869578s" podCreationTimestamp="2025-10-03 09:02:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:18.813295648 +0000 UTC m=+1383.114789978" watchObservedRunningTime="2025-10-03 09:02:18.814869578 +0000 UTC m=+1383.116363908" Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.835909 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzfhj\" (UniqueName: \"kubernetes.io/projected/dac4b22e-04b7-4780-acee-fd0971bd5e94-kube-api-access-mzfhj\") pod \"watcher8945-account-delete-pt9gj\" (UID: \"dac4b22e-04b7-4780-acee-fd0971bd5e94\") " pod="watcher-kuttl-default/watcher8945-account-delete-pt9gj" Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.839469 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.854284 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzfhj\" (UniqueName: \"kubernetes.io/projected/dac4b22e-04b7-4780-acee-fd0971bd5e94-kube-api-access-mzfhj\") pod \"watcher8945-account-delete-pt9gj\" (UID: \"dac4b22e-04b7-4780-acee-fd0971bd5e94\") " pod="watcher-kuttl-default/watcher8945-account-delete-pt9gj" Oct 03 09:02:18 crc kubenswrapper[4765]: I1003 09:02:18.864710 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.3643233439999998 podStartE2EDuration="6.864619894s" podCreationTimestamp="2025-10-03 09:02:12 +0000 UTC" firstStartedPulling="2025-10-03 09:02:13.705250654 +0000 UTC m=+1378.006744984" lastFinishedPulling="2025-10-03 09:02:18.205547204 +0000 UTC m=+1382.507041534" observedRunningTime="2025-10-03 09:02:18.863043535 +0000 UTC m=+1383.164537875" watchObservedRunningTime="2025-10-03 09:02:18.864619894 +0000 UTC m=+1383.166114224" Oct 03 09:02:18 crc 
kubenswrapper[4765]: E1003 09:02:18.940356 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Oct 03 09:02:18 crc kubenswrapper[4765]: E1003 09:02:18.940435 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data podName:6b2be982-d42e-40f6-840c-cbdb35fefc4b nodeName:}" failed. No retries permitted until 2025-10-03 09:02:19.440411608 +0000 UTC m=+1383.741905938 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data") pod "watcher-kuttl-api-0" (UID: "6b2be982-d42e-40f6-840c-cbdb35fefc4b") : secret "watcher-kuttl-api-config-data" not found Oct 03 09:02:19 crc kubenswrapper[4765]: I1003 09:02:19.019741 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher8945-account-delete-pt9gj" Oct 03 09:02:19 crc kubenswrapper[4765]: E1003 09:02:19.448517 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Oct 03 09:02:19 crc kubenswrapper[4765]: E1003 09:02:19.448758 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data podName:6b2be982-d42e-40f6-840c-cbdb35fefc4b nodeName:}" failed. No retries permitted until 2025-10-03 09:02:20.448744702 +0000 UTC m=+1384.750239032 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data") pod "watcher-kuttl-api-0" (UID: "6b2be982-d42e-40f6-840c-cbdb35fefc4b") : secret "watcher-kuttl-api-config-data" not found Oct 03 09:02:19 crc kubenswrapper[4765]: I1003 09:02:19.513104 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher8945-account-delete-pt9gj"] Oct 03 09:02:19 crc kubenswrapper[4765]: I1003 09:02:19.775090 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher8945-account-delete-pt9gj" event={"ID":"dac4b22e-04b7-4780-acee-fd0971bd5e94","Type":"ContainerStarted","Data":"7f0779cb52b4215fc79ae8065c56892a6740e507e4a3a5b41f70a32e94b373ec"} Oct 03 09:02:19 crc kubenswrapper[4765]: I1003 09:02:19.775133 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher8945-account-delete-pt9gj" event={"ID":"dac4b22e-04b7-4780-acee-fd0971bd5e94","Type":"ContainerStarted","Data":"1bb06d02607c1e84d733d4a0c4a8fa4fdffddcde537f592ecdeff02cc5a571c6"} Oct 03 09:02:19 crc kubenswrapper[4765]: I1003 09:02:19.775576 4765 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="watcher-kuttl-default/watcher-kuttl-api-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-cm42b\" not found" Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.370204 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="796f3197-44d8-408b-a8d4-2b5c6a282ff5" path="/var/lib/kubelet/pods/796f3197-44d8-408b-a8d4-2b5c6a282ff5/volumes" Oct 03 09:02:20 crc kubenswrapper[4765]: E1003 09:02:20.479963 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Oct 03 09:02:20 crc kubenswrapper[4765]: E1003 09:02:20.480019 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data podName:6b2be982-d42e-40f6-840c-cbdb35fefc4b nodeName:}" failed. No retries permitted until 2025-10-03 09:02:22.480005158 +0000 UTC m=+1386.781499488 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data") pod "watcher-kuttl-api-0" (UID: "6b2be982-d42e-40f6-840c-cbdb35fefc4b") : secret "watcher-kuttl-api-config-data" not found Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.783618 4765 generic.go:334] "Generic (PLEG): container finished" podID="814e2e41-1895-4103-b651-9e8e9db42905" containerID="6a8e4114cc89500a9ec7aab695e077b388beb5d9aa62d833e8ca372e856f6652" exitCode=0 Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.783687 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"814e2e41-1895-4103-b651-9e8e9db42905","Type":"ContainerDied","Data":"6a8e4114cc89500a9ec7aab695e077b388beb5d9aa62d833e8ca372e856f6652"} Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.786085 4765 generic.go:334] "Generic (PLEG): container finished" podID="dac4b22e-04b7-4780-acee-fd0971bd5e94" containerID="7f0779cb52b4215fc79ae8065c56892a6740e507e4a3a5b41f70a32e94b373ec" exitCode=0 Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.786166 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher8945-account-delete-pt9gj" event={"ID":"dac4b22e-04b7-4780-acee-fd0971bd5e94","Type":"ContainerDied","Data":"7f0779cb52b4215fc79ae8065c56892a6740e507e4a3a5b41f70a32e94b373ec"} Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.786187 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.786400 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerName="watcher-kuttl-api-log" containerID="cri-o://d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843" gracePeriod=30 Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.786965 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerName="watcher-api" containerID="cri-o://8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a" gracePeriod=30 Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.792779 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerName="watcher-api" probeResult="failure" output="Get 
\"https://10.217.0.156:9322/\": EOF" Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.874996 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.985978 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcj52\" (UniqueName: \"kubernetes.io/projected/814e2e41-1895-4103-b651-9e8e9db42905-kube-api-access-vcj52\") pod \"814e2e41-1895-4103-b651-9e8e9db42905\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.986052 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-combined-ca-bundle\") pod \"814e2e41-1895-4103-b651-9e8e9db42905\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.986134 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-config-data\") pod \"814e2e41-1895-4103-b651-9e8e9db42905\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.986157 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814e2e41-1895-4103-b651-9e8e9db42905-logs\") pod \"814e2e41-1895-4103-b651-9e8e9db42905\" (UID: \"814e2e41-1895-4103-b651-9e8e9db42905\") " Oct 03 09:02:20 crc kubenswrapper[4765]: I1003 09:02:20.986925 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/814e2e41-1895-4103-b651-9e8e9db42905-logs" (OuterVolumeSpecName: "logs") pod "814e2e41-1895-4103-b651-9e8e9db42905" (UID: "814e2e41-1895-4103-b651-9e8e9db42905"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:20.992833 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814e2e41-1895-4103-b651-9e8e9db42905-kube-api-access-vcj52" (OuterVolumeSpecName: "kube-api-access-vcj52") pod "814e2e41-1895-4103-b651-9e8e9db42905" (UID: "814e2e41-1895-4103-b651-9e8e9db42905"). InnerVolumeSpecName "kube-api-access-vcj52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.094146 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/814e2e41-1895-4103-b651-9e8e9db42905-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.094188 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcj52\" (UniqueName: \"kubernetes.io/projected/814e2e41-1895-4103-b651-9e8e9db42905-kube-api-access-vcj52\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.104052 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-config-data" (OuterVolumeSpecName: "config-data") pod "814e2e41-1895-4103-b651-9e8e9db42905" (UID: "814e2e41-1895-4103-b651-9e8e9db42905"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.117935 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "814e2e41-1895-4103-b651-9e8e9db42905" (UID: "814e2e41-1895-4103-b651-9e8e9db42905"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.197271 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.197319 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814e2e41-1895-4103-b651-9e8e9db42905-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.209905 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher8945-account-delete-pt9gj" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.400999 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzfhj\" (UniqueName: \"kubernetes.io/projected/dac4b22e-04b7-4780-acee-fd0971bd5e94-kube-api-access-mzfhj\") pod \"dac4b22e-04b7-4780-acee-fd0971bd5e94\" (UID: \"dac4b22e-04b7-4780-acee-fd0971bd5e94\") " Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.408969 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dac4b22e-04b7-4780-acee-fd0971bd5e94-kube-api-access-mzfhj" (OuterVolumeSpecName: "kube-api-access-mzfhj") pod "dac4b22e-04b7-4780-acee-fd0971bd5e94" (UID: "dac4b22e-04b7-4780-acee-fd0971bd5e94"). InnerVolumeSpecName "kube-api-access-mzfhj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.505201 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzfhj\" (UniqueName: \"kubernetes.io/projected/dac4b22e-04b7-4780-acee-fd0971bd5e94-kube-api-access-mzfhj\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.528093 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.528391 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="ceilometer-central-agent" containerID="cri-o://8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c" gracePeriod=30 Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.528745 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="proxy-httpd" containerID="cri-o://7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87" gracePeriod=30 Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.528831 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="sg-core" containerID="cri-o://f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c" gracePeriod=30 Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.528893 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="ceilometer-notification-agent" containerID="cri-o://4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8" gracePeriod=30 Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.569678 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.708075 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-logs\") pod \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.708130 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-custom-prometheus-ca\") pod \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.708165 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-combined-ca-bundle\") pod \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.708297 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-config-data\") pod \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.708351 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xcb4\" (UniqueName: \"kubernetes.io/projected/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-kube-api-access-5xcb4\") pod \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\" (UID: \"f497c8f6-6ad1-4287-a194-e7cd523e8eb7\") " Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.708621 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-logs" (OuterVolumeSpecName: "logs") pod "f497c8f6-6ad1-4287-a194-e7cd523e8eb7" (UID: "f497c8f6-6ad1-4287-a194-e7cd523e8eb7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.708854 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.717095 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-kube-api-access-5xcb4" (OuterVolumeSpecName: "kube-api-access-5xcb4") pod "f497c8f6-6ad1-4287-a194-e7cd523e8eb7" (UID: "f497c8f6-6ad1-4287-a194-e7cd523e8eb7"). InnerVolumeSpecName "kube-api-access-5xcb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.741704 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f497c8f6-6ad1-4287-a194-e7cd523e8eb7" (UID: "f497c8f6-6ad1-4287-a194-e7cd523e8eb7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.744765 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "f497c8f6-6ad1-4287-a194-e7cd523e8eb7" (UID: "f497c8f6-6ad1-4287-a194-e7cd523e8eb7"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.760857 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-config-data" (OuterVolumeSpecName: "config-data") pod "f497c8f6-6ad1-4287-a194-e7cd523e8eb7" (UID: "f497c8f6-6ad1-4287-a194-e7cd523e8eb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.794678 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.794754 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"814e2e41-1895-4103-b651-9e8e9db42905","Type":"ContainerDied","Data":"a1dee04a50a3af16bcd9e85c981ed3d703ab16b332debf69391f2bcb33f919a5"} Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.794796 4765 scope.go:117] "RemoveContainer" containerID="6a8e4114cc89500a9ec7aab695e077b388beb5d9aa62d833e8ca372e856f6652" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.796428 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher8945-account-delete-pt9gj" event={"ID":"dac4b22e-04b7-4780-acee-fd0971bd5e94","Type":"ContainerDied","Data":"1bb06d02607c1e84d733d4a0c4a8fa4fdffddcde537f592ecdeff02cc5a571c6"} Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.796463 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bb06d02607c1e84d733d4a0c4a8fa4fdffddcde537f592ecdeff02cc5a571c6" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.796531 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher8945-account-delete-pt9gj" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.810315 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.810344 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xcb4\" (UniqueName: \"kubernetes.io/projected/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-kube-api-access-5xcb4\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.810357 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.810370 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f497c8f6-6ad1-4287-a194-e7cd523e8eb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.822752 4765 generic.go:334] "Generic (PLEG): container finished" podID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerID="d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843" exitCode=143 Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.822847 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6b2be982-d42e-40f6-840c-cbdb35fefc4b","Type":"ContainerDied","Data":"d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843"} Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.826901 4765 generic.go:334] "Generic (PLEG): container finished" podID="f497c8f6-6ad1-4287-a194-e7cd523e8eb7" containerID="3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748" exitCode=0 Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.827196 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.827092 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f497c8f6-6ad1-4287-a194-e7cd523e8eb7","Type":"ContainerDied","Data":"3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748"} Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.827982 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"f497c8f6-6ad1-4287-a194-e7cd523e8eb7","Type":"ContainerDied","Data":"e4daa85676010b8ae43865e480f00d39a905b131b580e600031264bf5ebb0dca"} Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.838143 4765 generic.go:334] "Generic (PLEG): container finished" podID="dd692e27-9c41-4946-9776-955baa355470" containerID="7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87" exitCode=0 Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.838171 4765 generic.go:334] "Generic (PLEG): container finished" podID="dd692e27-9c41-4946-9776-955baa355470" containerID="f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c" exitCode=2 Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.838190 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dd692e27-9c41-4946-9776-955baa355470","Type":"ContainerDied","Data":"7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87"} Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.838234 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dd692e27-9c41-4946-9776-955baa355470","Type":"ContainerDied","Data":"f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c"} Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.872519 4765 scope.go:117] "RemoveContainer" containerID="3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.890824 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.898323 4765 scope.go:117] "RemoveContainer" containerID="3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748" Oct 03 09:02:21 crc kubenswrapper[4765]: E1003 09:02:21.898772 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748\": container with ID starting with 3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748 not found: ID does not exist" containerID="3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.898815 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748"} err="failed to get container status \"3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748\": rpc error: code = NotFound desc = could not find container \"3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748\": container with ID starting with 3f05ffa3de7010d992730b04abb3eec4fcdce81def31055ecf52bb58787eb748 not found: ID does not exist" Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.920073 4765 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.925666 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:02:21 crc kubenswrapper[4765]: I1003 09:02:21.937374 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.280751 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.318539 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="814e2e41-1895-4103-b651-9e8e9db42905" path="/var/lib/kubelet/pods/814e2e41-1895-4103-b651-9e8e9db42905/volumes" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.319232 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f497c8f6-6ad1-4287-a194-e7cd523e8eb7" path="/var/lib/kubelet/pods/f497c8f6-6ad1-4287-a194-e7cd523e8eb7/volumes" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.470536 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:22 crc kubenswrapper[4765]: E1003 09:02:22.520335 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-api-config-data: secret "watcher-kuttl-api-config-data" not found Oct 03 09:02:22 crc kubenswrapper[4765]: E1003 09:02:22.521405 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data podName:6b2be982-d42e-40f6-840c-cbdb35fefc4b nodeName:}" failed. No retries permitted until 2025-10-03 09:02:26.521385029 +0000 UTC m=+1390.822879359 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data") pod "watcher-kuttl-api-0" (UID: "6b2be982-d42e-40f6-840c-cbdb35fefc4b") : secret "watcher-kuttl-api-config-data" not found Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.621298 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-scripts\") pod \"dd692e27-9c41-4946-9776-955baa355470\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.621852 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-combined-ca-bundle\") pod \"dd692e27-9c41-4946-9776-955baa355470\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.621976 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f5gq\" (UniqueName: \"kubernetes.io/projected/dd692e27-9c41-4946-9776-955baa355470-kube-api-access-2f5gq\") pod \"dd692e27-9c41-4946-9776-955baa355470\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.622095 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-config-data\") pod \"dd692e27-9c41-4946-9776-955baa355470\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.622277 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-sg-core-conf-yaml\") pod \"dd692e27-9c41-4946-9776-955baa355470\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.622362 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-run-httpd\") pod \"dd692e27-9c41-4946-9776-955baa355470\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.622495 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-log-httpd\") pod \"dd692e27-9c41-4946-9776-955baa355470\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.622610 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-ceilometer-tls-certs\") pod \"dd692e27-9c41-4946-9776-955baa355470\" (UID: \"dd692e27-9c41-4946-9776-955baa355470\") " Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.623226 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dd692e27-9c41-4946-9776-955baa355470" (UID: "dd692e27-9c41-4946-9776-955baa355470"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.623534 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dd692e27-9c41-4946-9776-955baa355470" (UID: "dd692e27-9c41-4946-9776-955baa355470"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.627104 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-scripts" (OuterVolumeSpecName: "scripts") pod "dd692e27-9c41-4946-9776-955baa355470" (UID: "dd692e27-9c41-4946-9776-955baa355470"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.628312 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd692e27-9c41-4946-9776-955baa355470-kube-api-access-2f5gq" (OuterVolumeSpecName: "kube-api-access-2f5gq") pod "dd692e27-9c41-4946-9776-955baa355470" (UID: "dd692e27-9c41-4946-9776-955baa355470"). InnerVolumeSpecName "kube-api-access-2f5gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.658012 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dd692e27-9c41-4946-9776-955baa355470" (UID: "dd692e27-9c41-4946-9776-955baa355470"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.669478 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "dd692e27-9c41-4946-9776-955baa355470" (UID: "dd692e27-9c41-4946-9776-955baa355470"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.688829 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd692e27-9c41-4946-9776-955baa355470" (UID: "dd692e27-9c41-4946-9776-955baa355470"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.724201 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.724415 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f5gq\" (UniqueName: \"kubernetes.io/projected/dd692e27-9c41-4946-9776-955baa355470-kube-api-access-2f5gq\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.724477 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.724529 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.724591 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd692e27-9c41-4946-9776-955baa355470-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.724754 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.724844 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.730499 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-config-data" (OuterVolumeSpecName: "config-data") pod "dd692e27-9c41-4946-9776-955baa355470" (UID: "dd692e27-9c41-4946-9776-955baa355470"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.826401 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd692e27-9c41-4946-9776-955baa355470-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.850165 4765 generic.go:334] "Generic (PLEG): container finished" podID="dd692e27-9c41-4946-9776-955baa355470" containerID="4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8" exitCode=0 Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.850212 4765 generic.go:334] "Generic (PLEG): container finished" podID="dd692e27-9c41-4946-9776-955baa355470" containerID="8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c" exitCode=0 Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.850234 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.850263 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dd692e27-9c41-4946-9776-955baa355470","Type":"ContainerDied","Data":"4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8"} Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.850311 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dd692e27-9c41-4946-9776-955baa355470","Type":"ContainerDied","Data":"8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c"} Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.850327 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"dd692e27-9c41-4946-9776-955baa355470","Type":"ContainerDied","Data":"9d88333f9d3f16fb324d796d2ebf6d39c2880c44877399f59750b21e023f7b4d"} Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.850345 4765 scope.go:117] "RemoveContainer" containerID="7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.886009 4765 scope.go:117] "RemoveContainer" containerID="f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.906905 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.911893 4765 scope.go:117] "RemoveContainer" containerID="4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.914368 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.914913 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.156:9322/\": read tcp 10.217.0.2:33242->10.217.0.156:9322: read: connection reset by peer" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.915839 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.156:9322/\": dial tcp 10.217.0.156:9322: connect: connection refused" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.922837 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:22 crc kubenswrapper[4765]: E1003 09:02:22.923501 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f497c8f6-6ad1-4287-a194-e7cd523e8eb7" containerName="watcher-decision-engine" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923539 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f497c8f6-6ad1-4287-a194-e7cd523e8eb7" containerName="watcher-decision-engine" Oct 03 09:02:22 crc kubenswrapper[4765]: E1003 09:02:22.923553 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="sg-core" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923562 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="sg-core" Oct 
03 09:02:22 crc kubenswrapper[4765]: E1003 09:02:22.923579 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dac4b22e-04b7-4780-acee-fd0971bd5e94" containerName="mariadb-account-delete" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923588 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac4b22e-04b7-4780-acee-fd0971bd5e94" containerName="mariadb-account-delete" Oct 03 09:02:22 crc kubenswrapper[4765]: E1003 09:02:22.923604 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814e2e41-1895-4103-b651-9e8e9db42905" containerName="watcher-applier" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923609 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="814e2e41-1895-4103-b651-9e8e9db42905" containerName="watcher-applier" Oct 03 09:02:22 crc kubenswrapper[4765]: E1003 09:02:22.923619 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="proxy-httpd" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923625 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="proxy-httpd" Oct 03 09:02:22 crc kubenswrapper[4765]: E1003 09:02:22.923660 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="ceilometer-notification-agent" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923669 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="ceilometer-notification-agent" Oct 03 09:02:22 crc kubenswrapper[4765]: E1003 09:02:22.923680 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="ceilometer-central-agent" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923687 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="ceilometer-central-agent" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923871 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="proxy-httpd" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923891 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f497c8f6-6ad1-4287-a194-e7cd523e8eb7" containerName="watcher-decision-engine" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923901 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="dac4b22e-04b7-4780-acee-fd0971bd5e94" containerName="mariadb-account-delete" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923914 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="sg-core" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923925 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="ceilometer-notification-agent" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923936 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd692e27-9c41-4946-9776-955baa355470" containerName="ceilometer-central-agent" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.923952 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="814e2e41-1895-4103-b651-9e8e9db42905" containerName="watcher-applier" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.925818 4765 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.928038 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.928055 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.928277 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.935796 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:22 crc kubenswrapper[4765]: I1003 09:02:22.950670 4765 scope.go:117] "RemoveContainer" containerID="8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.029593 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-config-data\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.029667 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-log-httpd\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.029696 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.029721 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-scripts\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.029762 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.029800 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-run-httpd\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.029825 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2x8v\" (UniqueName: 
\"kubernetes.io/projected/c001d2a5-b27f-456a-a2b1-141a880b8174-kube-api-access-k2x8v\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.029854 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.067594 4765 scope.go:117] "RemoveContainer" containerID="7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87" Oct 03 09:02:23 crc kubenswrapper[4765]: E1003 09:02:23.068412 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87\": container with ID starting with 7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87 not found: ID does not exist" containerID="7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.068479 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87"} err="failed to get container status \"7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87\": rpc error: code = NotFound desc = could not find container \"7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87\": container with ID starting with 7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87 not found: ID does not exist" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.068503 4765 scope.go:117] "RemoveContainer" containerID="f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c" Oct 03 09:02:23 crc kubenswrapper[4765]: E1003 09:02:23.068916 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c\": container with ID starting with f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c not found: ID does not exist" containerID="f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.068954 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c"} err="failed to get container status \"f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c\": rpc error: code = NotFound desc = could not find container \"f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c\": container with ID starting with f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c not found: ID does not exist" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.068978 4765 scope.go:117] "RemoveContainer" containerID="4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8" Oct 03 09:02:23 crc kubenswrapper[4765]: E1003 09:02:23.069323 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8\": container with ID starting with 
4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8 not found: ID does not exist" containerID="4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.069375 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8"} err="failed to get container status \"4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8\": rpc error: code = NotFound desc = could not find container \"4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8\": container with ID starting with 4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8 not found: ID does not exist" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.069399 4765 scope.go:117] "RemoveContainer" containerID="8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c" Oct 03 09:02:23 crc kubenswrapper[4765]: E1003 09:02:23.069817 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c\": container with ID starting with 8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c not found: ID does not exist" containerID="8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.069839 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c"} err="failed to get container status \"8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c\": rpc error: code = NotFound desc = could not find container \"8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c\": container with ID starting with 8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c not found: ID does not exist" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.069856 4765 scope.go:117] "RemoveContainer" containerID="7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.070280 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87"} err="failed to get container status \"7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87\": rpc error: code = NotFound desc = could not find container \"7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87\": container with ID starting with 7b0a838d09d1f8e5dda48c85ff65d03d96320f14aa1115cb502077600166ce87 not found: ID does not exist" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.070297 4765 scope.go:117] "RemoveContainer" containerID="f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.070572 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c"} err="failed to get container status \"f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c\": rpc error: code = NotFound desc = could not find container \"f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c\": container with ID starting with f75c8cbcdeebbc2265f17a6512043a708c837bdb0cc944615199051b33cc7d6c not found: ID does not exist" Oct 03 
09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.070590 4765 scope.go:117] "RemoveContainer" containerID="4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.070849 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8"} err="failed to get container status \"4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8\": rpc error: code = NotFound desc = could not find container \"4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8\": container with ID starting with 4ea9ee96f0faed918eedf3026e52af26391c80dbf9886e1598feb36c3570b6a8 not found: ID does not exist" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.070879 4765 scope.go:117] "RemoveContainer" containerID="8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.071128 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c"} err="failed to get container status \"8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c\": rpc error: code = NotFound desc = could not find container \"8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c\": container with ID starting with 8e33ade36873f25cd6c35a750342c13049f8880c6cbf20f4b2f65c5f2ec1c71c not found: ID does not exist" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.131711 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2x8v\" (UniqueName: \"kubernetes.io/projected/c001d2a5-b27f-456a-a2b1-141a880b8174-kube-api-access-k2x8v\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.131759 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.131845 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-config-data\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.131872 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-log-httpd\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.131888 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.131905 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-scripts\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.131937 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.131963 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-run-httpd\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.132606 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-log-httpd\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.132981 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-run-httpd\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.147129 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-scripts\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.147337 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-config-data\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.147791 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.149436 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.165500 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.165633 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-k2x8v\" (UniqueName: \"kubernetes.io/projected/c001d2a5-b27f-456a-a2b1-141a880b8174-kube-api-access-k2x8v\") pod \"ceilometer-0\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.327510 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.352018 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.436037 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data\") pod \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.436082 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-combined-ca-bundle\") pod \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.436155 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-internal-tls-certs\") pod \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.436234 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgb72\" (UniqueName: \"kubernetes.io/projected/6b2be982-d42e-40f6-840c-cbdb35fefc4b-kube-api-access-rgb72\") pod \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.436284 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2be982-d42e-40f6-840c-cbdb35fefc4b-logs\") pod \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.436332 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-public-tls-certs\") pod \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.436413 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-custom-prometheus-ca\") pod \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\" (UID: \"6b2be982-d42e-40f6-840c-cbdb35fefc4b\") " Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.440027 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2be982-d42e-40f6-840c-cbdb35fefc4b-logs" (OuterVolumeSpecName: "logs") pod "6b2be982-d42e-40f6-840c-cbdb35fefc4b" (UID: "6b2be982-d42e-40f6-840c-cbdb35fefc4b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.445995 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b2be982-d42e-40f6-840c-cbdb35fefc4b-kube-api-access-rgb72" (OuterVolumeSpecName: "kube-api-access-rgb72") pod "6b2be982-d42e-40f6-840c-cbdb35fefc4b" (UID: "6b2be982-d42e-40f6-840c-cbdb35fefc4b"). InnerVolumeSpecName "kube-api-access-rgb72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.465811 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b2be982-d42e-40f6-840c-cbdb35fefc4b" (UID: "6b2be982-d42e-40f6-840c-cbdb35fefc4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.468595 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "6b2be982-d42e-40f6-840c-cbdb35fefc4b" (UID: "6b2be982-d42e-40f6-840c-cbdb35fefc4b"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.482555 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6b2be982-d42e-40f6-840c-cbdb35fefc4b" (UID: "6b2be982-d42e-40f6-840c-cbdb35fefc4b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.483295 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data" (OuterVolumeSpecName: "config-data") pod "6b2be982-d42e-40f6-840c-cbdb35fefc4b" (UID: "6b2be982-d42e-40f6-840c-cbdb35fefc4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.496306 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6b2be982-d42e-40f6-840c-cbdb35fefc4b" (UID: "6b2be982-d42e-40f6-840c-cbdb35fefc4b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.539038 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.539067 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.539075 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.539084 4765 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.539093 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgb72\" (UniqueName: \"kubernetes.io/projected/6b2be982-d42e-40f6-840c-cbdb35fefc4b-kube-api-access-rgb72\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.539103 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b2be982-d42e-40f6-840c-cbdb35fefc4b-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.539111 4765 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2be982-d42e-40f6-840c-cbdb35fefc4b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.747241 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-zcrmj"] Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.759012 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-zcrmj"] Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.769569 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher8945-account-delete-pt9gj"] Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.776502 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-8945-account-create-sd2pq"] Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.782619 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher8945-account-delete-pt9gj"] Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.788766 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-8945-account-create-sd2pq"] Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.794694 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.862549 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c001d2a5-b27f-456a-a2b1-141a880b8174","Type":"ContainerStarted","Data":"2ac1bf149582bca5b2370067c7b7146af05ea1e4a389a55e841010130fa812b8"} Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.865774 4765 generic.go:334] 
"Generic (PLEG): container finished" podID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerID="8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a" exitCode=0 Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.865800 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.865809 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6b2be982-d42e-40f6-840c-cbdb35fefc4b","Type":"ContainerDied","Data":"8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a"} Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.865904 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"6b2be982-d42e-40f6-840c-cbdb35fefc4b","Type":"ContainerDied","Data":"eed7451c613d2404816651890867b073eb3144c902db41fe12cfaefe4d935521"} Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.865951 4765 scope.go:117] "RemoveContainer" containerID="8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.886605 4765 scope.go:117] "RemoveContainer" containerID="d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.906279 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.916222 4765 scope.go:117] "RemoveContainer" containerID="8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a" Oct 03 09:02:23 crc kubenswrapper[4765]: E1003 09:02:23.916691 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a\": container with ID starting with 8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a not found: ID does not exist" containerID="8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.916734 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a"} err="failed to get container status \"8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a\": rpc error: code = NotFound desc = could not find container \"8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a\": container with ID starting with 8a669e0ca9c434079bad9eadee6b72778566c16870fa5bf81fa0f4084aa60c6a not found: ID does not exist" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.916760 4765 scope.go:117] "RemoveContainer" containerID="d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843" Oct 03 09:02:23 crc kubenswrapper[4765]: E1003 09:02:23.917076 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843\": container with ID starting with d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843 not found: ID does not exist" containerID="d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.917104 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843"} err="failed to get container status \"d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843\": rpc error: code = NotFound desc = could not find container \"d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843\": container with ID starting with d518aa16eb43acf135258aaecf8698528e8c875f6760020b6f37989dd9ea6843 not found: ID does not exist" Oct 03 09:02:23 crc kubenswrapper[4765]: I1003 09:02:23.917872 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.321427 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" path="/var/lib/kubelet/pods/6b2be982-d42e-40f6-840c-cbdb35fefc4b/volumes" Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.322588 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79ccdb04-ca5c-4266-8367-b0a04310f061" path="/var/lib/kubelet/pods/79ccdb04-ca5c-4266-8367-b0a04310f061/volumes" Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.325432 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac4b22e-04b7-4780-acee-fd0971bd5e94" path="/var/lib/kubelet/pods/dac4b22e-04b7-4780-acee-fd0971bd5e94/volumes" Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.326090 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd692e27-9c41-4946-9776-955baa355470" path="/var/lib/kubelet/pods/dd692e27-9c41-4946-9776-955baa355470/volumes" Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.326794 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbe411d1-aa31-4448-a2b6-176de25d7ff0" path="/var/lib/kubelet/pods/fbe411d1-aa31-4448-a2b6-176de25d7ff0/volumes" Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.888031 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c001d2a5-b27f-456a-a2b1-141a880b8174","Type":"ContainerStarted","Data":"55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb"} Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.926260 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-nt8ws"] Oct 03 09:02:24 crc kubenswrapper[4765]: E1003 09:02:24.926765 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerName="watcher-api" Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.926781 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerName="watcher-api" Oct 03 09:02:24 crc kubenswrapper[4765]: E1003 09:02:24.926830 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerName="watcher-kuttl-api-log" Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.926840 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerName="watcher-kuttl-api-log" Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.927028 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" containerName="watcher-kuttl-api-log" Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.927056 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2be982-d42e-40f6-840c-cbdb35fefc4b" 
containerName="watcher-api" Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.927837 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-nt8ws" Oct 03 09:02:24 crc kubenswrapper[4765]: I1003 09:02:24.934093 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-nt8ws"] Oct 03 09:02:25 crc kubenswrapper[4765]: I1003 09:02:25.063294 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdxg9\" (UniqueName: \"kubernetes.io/projected/575c88eb-254b-45ff-ab90-bbf9cf94c4fa-kube-api-access-rdxg9\") pod \"watcher-db-create-nt8ws\" (UID: \"575c88eb-254b-45ff-ab90-bbf9cf94c4fa\") " pod="watcher-kuttl-default/watcher-db-create-nt8ws" Oct 03 09:02:25 crc kubenswrapper[4765]: I1003 09:02:25.165790 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdxg9\" (UniqueName: \"kubernetes.io/projected/575c88eb-254b-45ff-ab90-bbf9cf94c4fa-kube-api-access-rdxg9\") pod \"watcher-db-create-nt8ws\" (UID: \"575c88eb-254b-45ff-ab90-bbf9cf94c4fa\") " pod="watcher-kuttl-default/watcher-db-create-nt8ws" Oct 03 09:02:25 crc kubenswrapper[4765]: I1003 09:02:25.184709 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdxg9\" (UniqueName: \"kubernetes.io/projected/575c88eb-254b-45ff-ab90-bbf9cf94c4fa-kube-api-access-rdxg9\") pod \"watcher-db-create-nt8ws\" (UID: \"575c88eb-254b-45ff-ab90-bbf9cf94c4fa\") " pod="watcher-kuttl-default/watcher-db-create-nt8ws" Oct 03 09:02:25 crc kubenswrapper[4765]: I1003 09:02:25.363984 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-nt8ws" Oct 03 09:02:25 crc kubenswrapper[4765]: I1003 09:02:25.868776 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-nt8ws"] Oct 03 09:02:25 crc kubenswrapper[4765]: I1003 09:02:25.924727 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-nt8ws" event={"ID":"575c88eb-254b-45ff-ab90-bbf9cf94c4fa","Type":"ContainerStarted","Data":"e6240ad67d671dc872a358b5ec92b290ae8e237011641cd6e0bd1395fb2b81cf"} Oct 03 09:02:25 crc kubenswrapper[4765]: I1003 09:02:25.944768 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c001d2a5-b27f-456a-a2b1-141a880b8174","Type":"ContainerStarted","Data":"3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a"} Oct 03 09:02:26 crc kubenswrapper[4765]: I1003 09:02:26.961278 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c001d2a5-b27f-456a-a2b1-141a880b8174","Type":"ContainerStarted","Data":"5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f"} Oct 03 09:02:26 crc kubenswrapper[4765]: I1003 09:02:26.963787 4765 generic.go:334] "Generic (PLEG): container finished" podID="575c88eb-254b-45ff-ab90-bbf9cf94c4fa" containerID="4459abc515635cdd34530daac327cd3e8b9803349e47b91ff79beed3a2f9e681" exitCode=0 Oct 03 09:02:26 crc kubenswrapper[4765]: I1003 09:02:26.963834 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-nt8ws" event={"ID":"575c88eb-254b-45ff-ab90-bbf9cf94c4fa","Type":"ContainerDied","Data":"4459abc515635cdd34530daac327cd3e8b9803349e47b91ff79beed3a2f9e681"} Oct 03 09:02:27 crc kubenswrapper[4765]: I1003 
09:02:27.973969 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c001d2a5-b27f-456a-a2b1-141a880b8174","Type":"ContainerStarted","Data":"4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7"} Oct 03 09:02:27 crc kubenswrapper[4765]: I1003 09:02:27.974250 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:27 crc kubenswrapper[4765]: I1003 09:02:27.996671 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.69991171 podStartE2EDuration="5.996622934s" podCreationTimestamp="2025-10-03 09:02:22 +0000 UTC" firstStartedPulling="2025-10-03 09:02:23.801854456 +0000 UTC m=+1388.103348786" lastFinishedPulling="2025-10-03 09:02:27.09856568 +0000 UTC m=+1391.400060010" observedRunningTime="2025-10-03 09:02:27.992879739 +0000 UTC m=+1392.294374069" watchObservedRunningTime="2025-10-03 09:02:27.996622934 +0000 UTC m=+1392.298117264" Oct 03 09:02:28 crc kubenswrapper[4765]: I1003 09:02:28.361291 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-nt8ws" Oct 03 09:02:28 crc kubenswrapper[4765]: I1003 09:02:28.533559 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdxg9\" (UniqueName: \"kubernetes.io/projected/575c88eb-254b-45ff-ab90-bbf9cf94c4fa-kube-api-access-rdxg9\") pod \"575c88eb-254b-45ff-ab90-bbf9cf94c4fa\" (UID: \"575c88eb-254b-45ff-ab90-bbf9cf94c4fa\") " Oct 03 09:02:28 crc kubenswrapper[4765]: I1003 09:02:28.539015 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/575c88eb-254b-45ff-ab90-bbf9cf94c4fa-kube-api-access-rdxg9" (OuterVolumeSpecName: "kube-api-access-rdxg9") pod "575c88eb-254b-45ff-ab90-bbf9cf94c4fa" (UID: "575c88eb-254b-45ff-ab90-bbf9cf94c4fa"). InnerVolumeSpecName "kube-api-access-rdxg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:28 crc kubenswrapper[4765]: I1003 09:02:28.635919 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdxg9\" (UniqueName: \"kubernetes.io/projected/575c88eb-254b-45ff-ab90-bbf9cf94c4fa-kube-api-access-rdxg9\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:28 crc kubenswrapper[4765]: I1003 09:02:28.990983 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-nt8ws" Oct 03 09:02:28 crc kubenswrapper[4765]: I1003 09:02:28.990972 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-nt8ws" event={"ID":"575c88eb-254b-45ff-ab90-bbf9cf94c4fa","Type":"ContainerDied","Data":"e6240ad67d671dc872a358b5ec92b290ae8e237011641cd6e0bd1395fb2b81cf"} Oct 03 09:02:28 crc kubenswrapper[4765]: I1003 09:02:28.991403 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6240ad67d671dc872a358b5ec92b290ae8e237011641cd6e0bd1395fb2b81cf" Oct 03 09:02:29 crc kubenswrapper[4765]: I1003 09:02:29.962543 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fpjfs"] Oct 03 09:02:29 crc kubenswrapper[4765]: E1003 09:02:29.963298 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575c88eb-254b-45ff-ab90-bbf9cf94c4fa" containerName="mariadb-database-create" Oct 03 09:02:29 crc kubenswrapper[4765]: I1003 09:02:29.963321 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="575c88eb-254b-45ff-ab90-bbf9cf94c4fa" containerName="mariadb-database-create" Oct 03 09:02:29 crc kubenswrapper[4765]: I1003 09:02:29.963532 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="575c88eb-254b-45ff-ab90-bbf9cf94c4fa" containerName="mariadb-database-create" Oct 03 09:02:29 crc kubenswrapper[4765]: I1003 09:02:29.965131 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:29 crc kubenswrapper[4765]: I1003 09:02:29.971303 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fpjfs"] Oct 03 09:02:30 crc kubenswrapper[4765]: I1003 09:02:30.063387 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjk5n\" (UniqueName: \"kubernetes.io/projected/1c18f25e-29fa-49bb-8c72-38b1db164e6c-kube-api-access-bjk5n\") pod \"redhat-operators-fpjfs\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:30 crc kubenswrapper[4765]: I1003 09:02:30.063486 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-catalog-content\") pod \"redhat-operators-fpjfs\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:30 crc kubenswrapper[4765]: I1003 09:02:30.063571 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-utilities\") pod \"redhat-operators-fpjfs\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:30 crc kubenswrapper[4765]: I1003 09:02:30.164601 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjk5n\" (UniqueName: \"kubernetes.io/projected/1c18f25e-29fa-49bb-8c72-38b1db164e6c-kube-api-access-bjk5n\") pod \"redhat-operators-fpjfs\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:30 crc kubenswrapper[4765]: I1003 09:02:30.165060 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-catalog-content\") pod \"redhat-operators-fpjfs\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:30 crc kubenswrapper[4765]: I1003 09:02:30.165836 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-catalog-content\") pod \"redhat-operators-fpjfs\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:30 crc kubenswrapper[4765]: I1003 09:02:30.165908 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-utilities\") pod \"redhat-operators-fpjfs\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:30 crc kubenswrapper[4765]: I1003 09:02:30.166196 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-utilities\") pod \"redhat-operators-fpjfs\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:30 crc kubenswrapper[4765]: I1003 09:02:30.188524 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjk5n\" (UniqueName: \"kubernetes.io/projected/1c18f25e-29fa-49bb-8c72-38b1db164e6c-kube-api-access-bjk5n\") pod \"redhat-operators-fpjfs\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:30 crc kubenswrapper[4765]: I1003 09:02:30.282540 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:30 crc kubenswrapper[4765]: I1003 09:02:30.818353 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fpjfs"] Oct 03 09:02:31 crc kubenswrapper[4765]: I1003 09:02:31.019249 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpjfs" event={"ID":"1c18f25e-29fa-49bb-8c72-38b1db164e6c","Type":"ContainerStarted","Data":"ea80f25a72abe0fe95e94eead5d1a32754df2b2e0787aa0a5f6ee32edd7ed404"} Oct 03 09:02:31 crc kubenswrapper[4765]: I1003 09:02:31.019295 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpjfs" event={"ID":"1c18f25e-29fa-49bb-8c72-38b1db164e6c","Type":"ContainerStarted","Data":"12eb952fec3b356afd298ef8ce28f42031546945473748290479c272edfd120a"} Oct 03 09:02:32 crc kubenswrapper[4765]: I1003 09:02:32.028351 4765 generic.go:334] "Generic (PLEG): container finished" podID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" containerID="ea80f25a72abe0fe95e94eead5d1a32754df2b2e0787aa0a5f6ee32edd7ed404" exitCode=0 Oct 03 09:02:32 crc kubenswrapper[4765]: I1003 09:02:32.028420 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpjfs" event={"ID":"1c18f25e-29fa-49bb-8c72-38b1db164e6c","Type":"ContainerDied","Data":"ea80f25a72abe0fe95e94eead5d1a32754df2b2e0787aa0a5f6ee32edd7ed404"} Oct 03 09:02:34 crc kubenswrapper[4765]: I1003 09:02:34.050211 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpjfs" event={"ID":"1c18f25e-29fa-49bb-8c72-38b1db164e6c","Type":"ContainerStarted","Data":"1092a8f3640f0faa0190fafddc31bf953e5e83b202dc3594cc29da82c206bc28"} Oct 03 09:02:34 crc kubenswrapper[4765]: I1003 09:02:34.930907 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-adea-account-create-8dvhn"] Oct 03 09:02:34 crc kubenswrapper[4765]: I1003 09:02:34.931836 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-adea-account-create-8dvhn" Oct 03 09:02:34 crc kubenswrapper[4765]: I1003 09:02:34.933977 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Oct 03 09:02:34 crc kubenswrapper[4765]: I1003 09:02:34.940517 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-adea-account-create-8dvhn"] Oct 03 09:02:35 crc kubenswrapper[4765]: I1003 09:02:35.040034 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2nbj\" (UniqueName: \"kubernetes.io/projected/7a703dbc-56c9-490f-8180-0322a355f7f3-kube-api-access-c2nbj\") pod \"watcher-adea-account-create-8dvhn\" (UID: \"7a703dbc-56c9-490f-8180-0322a355f7f3\") " pod="watcher-kuttl-default/watcher-adea-account-create-8dvhn" Oct 03 09:02:35 crc kubenswrapper[4765]: I1003 09:02:35.058763 4765 generic.go:334] "Generic (PLEG): container finished" podID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" containerID="1092a8f3640f0faa0190fafddc31bf953e5e83b202dc3594cc29da82c206bc28" exitCode=0 Oct 03 09:02:35 crc kubenswrapper[4765]: I1003 09:02:35.058815 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpjfs" event={"ID":"1c18f25e-29fa-49bb-8c72-38b1db164e6c","Type":"ContainerDied","Data":"1092a8f3640f0faa0190fafddc31bf953e5e83b202dc3594cc29da82c206bc28"} Oct 03 09:02:35 crc kubenswrapper[4765]: I1003 09:02:35.141515 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2nbj\" (UniqueName: \"kubernetes.io/projected/7a703dbc-56c9-490f-8180-0322a355f7f3-kube-api-access-c2nbj\") pod \"watcher-adea-account-create-8dvhn\" (UID: \"7a703dbc-56c9-490f-8180-0322a355f7f3\") " pod="watcher-kuttl-default/watcher-adea-account-create-8dvhn" Oct 03 09:02:35 crc kubenswrapper[4765]: I1003 09:02:35.160287 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2nbj\" (UniqueName: \"kubernetes.io/projected/7a703dbc-56c9-490f-8180-0322a355f7f3-kube-api-access-c2nbj\") pod \"watcher-adea-account-create-8dvhn\" (UID: \"7a703dbc-56c9-490f-8180-0322a355f7f3\") " pod="watcher-kuttl-default/watcher-adea-account-create-8dvhn" Oct 03 09:02:35 crc kubenswrapper[4765]: I1003 09:02:35.247972 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-adea-account-create-8dvhn" Oct 03 09:02:35 crc kubenswrapper[4765]: I1003 09:02:35.658730 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-adea-account-create-8dvhn"] Oct 03 09:02:36 crc kubenswrapper[4765]: I1003 09:02:36.069231 4765 generic.go:334] "Generic (PLEG): container finished" podID="7a703dbc-56c9-490f-8180-0322a355f7f3" containerID="066430c682df73e67f1c634704cdfee0b662d84f04be4d8d8c4ef68a2c49e23f" exitCode=0 Oct 03 09:02:36 crc kubenswrapper[4765]: I1003 09:02:36.069301 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-adea-account-create-8dvhn" event={"ID":"7a703dbc-56c9-490f-8180-0322a355f7f3","Type":"ContainerDied","Data":"066430c682df73e67f1c634704cdfee0b662d84f04be4d8d8c4ef68a2c49e23f"} Oct 03 09:02:36 crc kubenswrapper[4765]: I1003 09:02:36.073270 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-adea-account-create-8dvhn" event={"ID":"7a703dbc-56c9-490f-8180-0322a355f7f3","Type":"ContainerStarted","Data":"1c35fd2098ba04c1f33ed50f67b2653c95ba9d3d241804828309900012d74841"} Oct 03 09:02:36 crc kubenswrapper[4765]: I1003 09:02:36.073445 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpjfs" event={"ID":"1c18f25e-29fa-49bb-8c72-38b1db164e6c","Type":"ContainerStarted","Data":"cb60f94d4f619aac0c10c41008dd801d40d5be9238203d84ac4920aea4cf4df3"} Oct 03 09:02:36 crc kubenswrapper[4765]: I1003 09:02:36.099776 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fpjfs" podStartSLOduration=3.468839517 podStartE2EDuration="7.099758198s" podCreationTimestamp="2025-10-03 09:02:29 +0000 UTC" firstStartedPulling="2025-10-03 09:02:32.030217402 +0000 UTC m=+1396.331711732" lastFinishedPulling="2025-10-03 09:02:35.661136083 +0000 UTC m=+1399.962630413" observedRunningTime="2025-10-03 09:02:36.094799582 +0000 UTC m=+1400.396293922" watchObservedRunningTime="2025-10-03 09:02:36.099758198 +0000 UTC m=+1400.401252528" Oct 03 09:02:37 crc kubenswrapper[4765]: I1003 09:02:37.444491 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-adea-account-create-8dvhn" Oct 03 09:02:37 crc kubenswrapper[4765]: I1003 09:02:37.579020 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2nbj\" (UniqueName: \"kubernetes.io/projected/7a703dbc-56c9-490f-8180-0322a355f7f3-kube-api-access-c2nbj\") pod \"7a703dbc-56c9-490f-8180-0322a355f7f3\" (UID: \"7a703dbc-56c9-490f-8180-0322a355f7f3\") " Oct 03 09:02:37 crc kubenswrapper[4765]: I1003 09:02:37.596870 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a703dbc-56c9-490f-8180-0322a355f7f3-kube-api-access-c2nbj" (OuterVolumeSpecName: "kube-api-access-c2nbj") pod "7a703dbc-56c9-490f-8180-0322a355f7f3" (UID: "7a703dbc-56c9-490f-8180-0322a355f7f3"). InnerVolumeSpecName "kube-api-access-c2nbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:37 crc kubenswrapper[4765]: I1003 09:02:37.680311 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2nbj\" (UniqueName: \"kubernetes.io/projected/7a703dbc-56c9-490f-8180-0322a355f7f3-kube-api-access-c2nbj\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:38 crc kubenswrapper[4765]: I1003 09:02:38.088695 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-adea-account-create-8dvhn" event={"ID":"7a703dbc-56c9-490f-8180-0322a355f7f3","Type":"ContainerDied","Data":"1c35fd2098ba04c1f33ed50f67b2653c95ba9d3d241804828309900012d74841"} Oct 03 09:02:38 crc kubenswrapper[4765]: I1003 09:02:38.088976 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c35fd2098ba04c1f33ed50f67b2653c95ba9d3d241804828309900012d74841" Oct 03 09:02:38 crc kubenswrapper[4765]: I1003 09:02:38.088734 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-adea-account-create-8dvhn" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.121573 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vslqx"] Oct 03 09:02:40 crc kubenswrapper[4765]: E1003 09:02:40.122279 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a703dbc-56c9-490f-8180-0322a355f7f3" containerName="mariadb-account-create" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.122294 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a703dbc-56c9-490f-8180-0322a355f7f3" containerName="mariadb-account-create" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.122471 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a703dbc-56c9-490f-8180-0322a355f7f3" containerName="mariadb-account-create" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.123156 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.124992 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-shwdt" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.127174 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.134244 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vslqx"] Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.216072 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.216418 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-config-data\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.216619 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9vv8\" (UniqueName: \"kubernetes.io/projected/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-kube-api-access-w9vv8\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.216860 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-db-sync-config-data\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.283570 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.283620 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.317861 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-db-sync-config-data\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.317967 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 
09:02:40.319167 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-config-data\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.319219 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9vv8\" (UniqueName: \"kubernetes.io/projected/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-kube-api-access-w9vv8\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.325466 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-db-sync-config-data\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.329245 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-config-data\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.329293 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.344763 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9vv8\" (UniqueName: \"kubernetes.io/projected/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-kube-api-access-w9vv8\") pod \"watcher-kuttl-db-sync-vslqx\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:40 crc kubenswrapper[4765]: I1003 09:02:40.441742 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:41 crc kubenswrapper[4765]: I1003 09:02:41.007838 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vslqx"] Oct 03 09:02:41 crc kubenswrapper[4765]: I1003 09:02:41.120716 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" event={"ID":"5eab1ba2-0082-4d6c-b42d-9cb409d59fac","Type":"ContainerStarted","Data":"d9a9de626572aa083f75c5e0d4765ed293748db5c82ad2626c514f6e3fe0ee99"} Oct 03 09:02:41 crc kubenswrapper[4765]: I1003 09:02:41.337615 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fpjfs" podUID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" containerName="registry-server" probeResult="failure" output=< Oct 03 09:02:41 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s Oct 03 09:02:41 crc kubenswrapper[4765]: > Oct 03 09:02:42 crc kubenswrapper[4765]: I1003 09:02:42.130498 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" event={"ID":"5eab1ba2-0082-4d6c-b42d-9cb409d59fac","Type":"ContainerStarted","Data":"c6deb8c59356abc0c9a2545173d3120e90f8ca2abc81252a82f9ce2813e834a4"} Oct 03 09:02:42 crc kubenswrapper[4765]: I1003 09:02:42.153473 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" podStartSLOduration=2.153448948 podStartE2EDuration="2.153448948s" podCreationTimestamp="2025-10-03 09:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:42.1483745 +0000 UTC m=+1406.449868830" watchObservedRunningTime="2025-10-03 09:02:42.153448948 +0000 UTC m=+1406.454943278" Oct 03 09:02:44 crc kubenswrapper[4765]: I1003 09:02:44.147979 4765 generic.go:334] "Generic (PLEG): container finished" podID="5eab1ba2-0082-4d6c-b42d-9cb409d59fac" containerID="c6deb8c59356abc0c9a2545173d3120e90f8ca2abc81252a82f9ce2813e834a4" exitCode=0 Oct 03 09:02:44 crc kubenswrapper[4765]: I1003 09:02:44.148063 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" event={"ID":"5eab1ba2-0082-4d6c-b42d-9cb409d59fac","Type":"ContainerDied","Data":"c6deb8c59356abc0c9a2545173d3120e90f8ca2abc81252a82f9ce2813e834a4"} Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.578137 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.750244 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-config-data\") pod \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.750292 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9vv8\" (UniqueName: \"kubernetes.io/projected/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-kube-api-access-w9vv8\") pod \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.750455 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-db-sync-config-data\") pod \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.750509 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-combined-ca-bundle\") pod \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\" (UID: \"5eab1ba2-0082-4d6c-b42d-9cb409d59fac\") " Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.755169 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5eab1ba2-0082-4d6c-b42d-9cb409d59fac" (UID: "5eab1ba2-0082-4d6c-b42d-9cb409d59fac"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.755639 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-kube-api-access-w9vv8" (OuterVolumeSpecName: "kube-api-access-w9vv8") pod "5eab1ba2-0082-4d6c-b42d-9cb409d59fac" (UID: "5eab1ba2-0082-4d6c-b42d-9cb409d59fac"). InnerVolumeSpecName "kube-api-access-w9vv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.774836 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5eab1ba2-0082-4d6c-b42d-9cb409d59fac" (UID: "5eab1ba2-0082-4d6c-b42d-9cb409d59fac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.795794 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-config-data" (OuterVolumeSpecName: "config-data") pod "5eab1ba2-0082-4d6c-b42d-9cb409d59fac" (UID: "5eab1ba2-0082-4d6c-b42d-9cb409d59fac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.851785 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.851815 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.851824 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:45 crc kubenswrapper[4765]: I1003 09:02:45.851833 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9vv8\" (UniqueName: \"kubernetes.io/projected/5eab1ba2-0082-4d6c-b42d-9cb409d59fac-kube-api-access-w9vv8\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.163942 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" event={"ID":"5eab1ba2-0082-4d6c-b42d-9cb409d59fac","Type":"ContainerDied","Data":"d9a9de626572aa083f75c5e0d4765ed293748db5c82ad2626c514f6e3fe0ee99"} Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.163986 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-vslqx" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.163991 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9a9de626572aa083f75c5e0d4765ed293748db5c82ad2626c514f6e3fe0ee99" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.415159 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:02:46 crc kubenswrapper[4765]: E1003 09:02:46.415604 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eab1ba2-0082-4d6c-b42d-9cb409d59fac" containerName="watcher-kuttl-db-sync" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.415629 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eab1ba2-0082-4d6c-b42d-9cb409d59fac" containerName="watcher-kuttl-db-sync" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.415881 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eab1ba2-0082-4d6c-b42d-9cb409d59fac" containerName="watcher-kuttl-db-sync" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.416604 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.419739 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-shwdt" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.420969 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.429880 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.484399 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.485558 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.488003 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.506357 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.561378 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.561500 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.561536 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.561570 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eef47c2-953c-4299-a994-a3ab5583b2d0-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.561756 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljtwk\" (UniqueName: \"kubernetes.io/projected/1eef47c2-953c-4299-a994-a3ab5583b2d0-kube-api-access-ljtwk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 
09:02:46.572463 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.573880 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.577473 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.578002 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.581282 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.592544 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663172 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663236 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663257 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9m5s\" (UniqueName: \"kubernetes.io/projected/02077287-6b7d-4d30-8a19-e7f42699e5d2-kube-api-access-j9m5s\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663301 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663327 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663356 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eef47c2-953c-4299-a994-a3ab5583b2d0-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663377 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663409 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663450 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663469 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663514 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663535 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663559 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljtwk\" (UniqueName: \"kubernetes.io/projected/1eef47c2-953c-4299-a994-a3ab5583b2d0-kube-api-access-ljtwk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663599 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02077287-6b7d-4d30-8a19-e7f42699e5d2-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.663620 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc 
kubenswrapper[4765]: I1003 09:02:46.663641 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjrfd\" (UniqueName: \"kubernetes.io/projected/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-kube-api-access-fjrfd\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.664444 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eef47c2-953c-4299-a994-a3ab5583b2d0-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.670946 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.673045 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.673495 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.686582 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljtwk\" (UniqueName: \"kubernetes.io/projected/1eef47c2-953c-4299-a994-a3ab5583b2d0-kube-api-access-ljtwk\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.766238 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.766308 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.766370 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02077287-6b7d-4d30-8a19-e7f42699e5d2-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc 
kubenswrapper[4765]: I1003 09:02:46.766399 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjrfd\" (UniqueName: \"kubernetes.io/projected/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-kube-api-access-fjrfd\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.766438 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.766460 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.766483 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9m5s\" (UniqueName: \"kubernetes.io/projected/02077287-6b7d-4d30-8a19-e7f42699e5d2-kube-api-access-j9m5s\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.766524 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.766561 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.766611 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.766672 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.770454 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.770617 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.771275 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.775187 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-logs\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.775530 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02077287-6b7d-4d30-8a19-e7f42699e5d2-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.777845 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.780339 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.782895 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.783408 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.788708 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.789843 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjrfd\" (UniqueName: \"kubernetes.io/projected/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-kube-api-access-fjrfd\") pod \"watcher-kuttl-api-0\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.796919 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9m5s\" (UniqueName: \"kubernetes.io/projected/02077287-6b7d-4d30-8a19-e7f42699e5d2-kube-api-access-j9m5s\") pod \"watcher-kuttl-applier-0\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.804154 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:46 crc kubenswrapper[4765]: I1003 09:02:46.895059 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:47 crc kubenswrapper[4765]: I1003 09:02:47.338195 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:02:47 crc kubenswrapper[4765]: I1003 09:02:47.349670 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:02:47 crc kubenswrapper[4765]: I1003 09:02:47.513189 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:02:47 crc kubenswrapper[4765]: W1003 09:02:47.524849 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c3fda04_8007_4e3b_9283_9a28e6c57c7c.slice/crio-4f38d01ece8841ac7637f5362e3b48408f7a82468b71be9891b50fa71506a684 WatchSource:0}: Error finding container 4f38d01ece8841ac7637f5362e3b48408f7a82468b71be9891b50fa71506a684: Status 404 returned error can't find the container with id 4f38d01ece8841ac7637f5362e3b48408f7a82468b71be9891b50fa71506a684 Oct 03 09:02:48 crc kubenswrapper[4765]: I1003 09:02:48.186988 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"1eef47c2-953c-4299-a994-a3ab5583b2d0","Type":"ContainerStarted","Data":"3475db93fee76cefce14c7c9aef47cf2bafa98b960c66b70f4d3da860393bb9d"} Oct 03 09:02:48 crc kubenswrapper[4765]: I1003 09:02:48.187293 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"1eef47c2-953c-4299-a994-a3ab5583b2d0","Type":"ContainerStarted","Data":"a6586bd5d098bf1c2249fe0eb885aa9ffc184dea725da899bd4cc5dda4f34c7c"} Oct 03 09:02:48 crc kubenswrapper[4765]: I1003 09:02:48.191506 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" 
event={"ID":"02077287-6b7d-4d30-8a19-e7f42699e5d2","Type":"ContainerStarted","Data":"fb430e7d139d8ef6c9d2c00d38b03c6289fd36807707c4b86a3953e4fc3f713d"} Oct 03 09:02:48 crc kubenswrapper[4765]: I1003 09:02:48.191552 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"02077287-6b7d-4d30-8a19-e7f42699e5d2","Type":"ContainerStarted","Data":"3a6d4977b5de3fce7113a8a0559b82e44601759afb53733a6c517a6a4ffa30dd"} Oct 03 09:02:48 crc kubenswrapper[4765]: I1003 09:02:48.194063 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5c3fda04-8007-4e3b-9283-9a28e6c57c7c","Type":"ContainerStarted","Data":"10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b"} Oct 03 09:02:48 crc kubenswrapper[4765]: I1003 09:02:48.194275 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5c3fda04-8007-4e3b-9283-9a28e6c57c7c","Type":"ContainerStarted","Data":"4f38d01ece8841ac7637f5362e3b48408f7a82468b71be9891b50fa71506a684"} Oct 03 09:02:48 crc kubenswrapper[4765]: I1003 09:02:48.216250 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.216226988 podStartE2EDuration="2.216226988s" podCreationTimestamp="2025-10-03 09:02:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:48.210967856 +0000 UTC m=+1412.512462206" watchObservedRunningTime="2025-10-03 09:02:48.216226988 +0000 UTC m=+1412.517721328" Oct 03 09:02:48 crc kubenswrapper[4765]: I1003 09:02:48.240622 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.240598234 podStartE2EDuration="2.240598234s" podCreationTimestamp="2025-10-03 09:02:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:48.240493051 +0000 UTC m=+1412.541987381" watchObservedRunningTime="2025-10-03 09:02:48.240598234 +0000 UTC m=+1412.542092564" Oct 03 09:02:49 crc kubenswrapper[4765]: I1003 09:02:49.212536 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5c3fda04-8007-4e3b-9283-9a28e6c57c7c","Type":"ContainerStarted","Data":"b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5"} Oct 03 09:02:49 crc kubenswrapper[4765]: I1003 09:02:49.213719 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:49 crc kubenswrapper[4765]: I1003 09:02:49.238107 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=3.238079947 podStartE2EDuration="3.238079947s" podCreationTimestamp="2025-10-03 09:02:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:49.235110952 +0000 UTC m=+1413.536605302" watchObservedRunningTime="2025-10-03 09:02:49.238079947 +0000 UTC m=+1413.539574277" Oct 03 09:02:50 crc kubenswrapper[4765]: I1003 09:02:50.328753 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:50 crc 
kubenswrapper[4765]: I1003 09:02:50.395559 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:51 crc kubenswrapper[4765]: I1003 09:02:51.226187 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:02:51 crc kubenswrapper[4765]: I1003 09:02:51.720591 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:51 crc kubenswrapper[4765]: I1003 09:02:51.804984 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:51 crc kubenswrapper[4765]: I1003 09:02:51.895712 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:53 crc kubenswrapper[4765]: I1003 09:02:53.488030 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:02:53 crc kubenswrapper[4765]: I1003 09:02:53.950910 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fpjfs"] Oct 03 09:02:53 crc kubenswrapper[4765]: I1003 09:02:53.951186 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fpjfs" podUID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" containerName="registry-server" containerID="cri-o://cb60f94d4f619aac0c10c41008dd801d40d5be9238203d84ac4920aea4cf4df3" gracePeriod=2 Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.255101 4765 generic.go:334] "Generic (PLEG): container finished" podID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" containerID="cb60f94d4f619aac0c10c41008dd801d40d5be9238203d84ac4920aea4cf4df3" exitCode=0 Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.255178 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpjfs" event={"ID":"1c18f25e-29fa-49bb-8c72-38b1db164e6c","Type":"ContainerDied","Data":"cb60f94d4f619aac0c10c41008dd801d40d5be9238203d84ac4920aea4cf4df3"} Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.565561 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.712397 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-catalog-content\") pod \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.712827 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjk5n\" (UniqueName: \"kubernetes.io/projected/1c18f25e-29fa-49bb-8c72-38b1db164e6c-kube-api-access-bjk5n\") pod \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.712906 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-utilities\") pod \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\" (UID: \"1c18f25e-29fa-49bb-8c72-38b1db164e6c\") " Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.713847 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-utilities" (OuterVolumeSpecName: "utilities") pod "1c18f25e-29fa-49bb-8c72-38b1db164e6c" (UID: "1c18f25e-29fa-49bb-8c72-38b1db164e6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.720627 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c18f25e-29fa-49bb-8c72-38b1db164e6c-kube-api-access-bjk5n" (OuterVolumeSpecName: "kube-api-access-bjk5n") pod "1c18f25e-29fa-49bb-8c72-38b1db164e6c" (UID: "1c18f25e-29fa-49bb-8c72-38b1db164e6c"). InnerVolumeSpecName "kube-api-access-bjk5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.800068 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c18f25e-29fa-49bb-8c72-38b1db164e6c" (UID: "1c18f25e-29fa-49bb-8c72-38b1db164e6c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.815126 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.815183 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjk5n\" (UniqueName: \"kubernetes.io/projected/1c18f25e-29fa-49bb-8c72-38b1db164e6c-kube-api-access-bjk5n\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:54 crc kubenswrapper[4765]: I1003 09:02:54.815194 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c18f25e-29fa-49bb-8c72-38b1db164e6c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:55 crc kubenswrapper[4765]: I1003 09:02:55.267200 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpjfs" event={"ID":"1c18f25e-29fa-49bb-8c72-38b1db164e6c","Type":"ContainerDied","Data":"12eb952fec3b356afd298ef8ce28f42031546945473748290479c272edfd120a"} Oct 03 09:02:55 crc kubenswrapper[4765]: I1003 09:02:55.267244 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fpjfs" Oct 03 09:02:55 crc kubenswrapper[4765]: I1003 09:02:55.267255 4765 scope.go:117] "RemoveContainer" containerID="cb60f94d4f619aac0c10c41008dd801d40d5be9238203d84ac4920aea4cf4df3" Oct 03 09:02:55 crc kubenswrapper[4765]: I1003 09:02:55.286377 4765 scope.go:117] "RemoveContainer" containerID="1092a8f3640f0faa0190fafddc31bf953e5e83b202dc3594cc29da82c206bc28" Oct 03 09:02:55 crc kubenswrapper[4765]: I1003 09:02:55.301041 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fpjfs"] Oct 03 09:02:55 crc kubenswrapper[4765]: I1003 09:02:55.306558 4765 scope.go:117] "RemoveContainer" containerID="ea80f25a72abe0fe95e94eead5d1a32754df2b2e0787aa0a5f6ee32edd7ed404" Oct 03 09:02:55 crc kubenswrapper[4765]: I1003 09:02:55.309232 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fpjfs"] Oct 03 09:02:56 crc kubenswrapper[4765]: I1003 09:02:56.318382 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" path="/var/lib/kubelet/pods/1c18f25e-29fa-49bb-8c72-38b1db164e6c/volumes" Oct 03 09:02:56 crc kubenswrapper[4765]: I1003 09:02:56.784785 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:56 crc kubenswrapper[4765]: I1003 09:02:56.804676 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:56 crc kubenswrapper[4765]: I1003 09:02:56.808340 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:56 crc kubenswrapper[4765]: I1003 09:02:56.829761 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:56 crc kubenswrapper[4765]: I1003 09:02:56.900787 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:56 crc kubenswrapper[4765]: I1003 09:02:56.920684 4765 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:57 crc kubenswrapper[4765]: I1003 09:02:57.283691 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:57 crc kubenswrapper[4765]: I1003 09:02:57.292154 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:02:57 crc kubenswrapper[4765]: I1003 09:02:57.317142 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:02:57 crc kubenswrapper[4765]: I1003 09:02:57.323633 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:02:59 crc kubenswrapper[4765]: I1003 09:02:59.687963 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:02:59 crc kubenswrapper[4765]: I1003 09:02:59.688742 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="ceilometer-central-agent" containerID="cri-o://55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb" gracePeriod=30 Oct 03 09:02:59 crc kubenswrapper[4765]: I1003 09:02:59.688884 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="proxy-httpd" containerID="cri-o://4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7" gracePeriod=30 Oct 03 09:02:59 crc kubenswrapper[4765]: I1003 09:02:59.688932 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="sg-core" containerID="cri-o://5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f" gracePeriod=30 Oct 03 09:02:59 crc kubenswrapper[4765]: I1003 09:02:59.688961 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="ceilometer-notification-agent" containerID="cri-o://3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a" gracePeriod=30 Oct 03 09:03:00 crc kubenswrapper[4765]: I1003 09:03:00.329526 4765 generic.go:334] "Generic (PLEG): container finished" podID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerID="4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7" exitCode=0 Oct 03 09:03:00 crc kubenswrapper[4765]: I1003 09:03:00.329570 4765 generic.go:334] "Generic (PLEG): container finished" podID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerID="5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f" exitCode=2 Oct 03 09:03:00 crc kubenswrapper[4765]: I1003 09:03:00.329579 4765 generic.go:334] "Generic (PLEG): container finished" podID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerID="55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb" exitCode=0 Oct 03 09:03:00 crc kubenswrapper[4765]: I1003 09:03:00.329601 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c001d2a5-b27f-456a-a2b1-141a880b8174","Type":"ContainerDied","Data":"4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7"} Oct 03 
09:03:00 crc kubenswrapper[4765]: I1003 09:03:00.329628 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c001d2a5-b27f-456a-a2b1-141a880b8174","Type":"ContainerDied","Data":"5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f"} Oct 03 09:03:00 crc kubenswrapper[4765]: I1003 09:03:00.329712 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c001d2a5-b27f-456a-a2b1-141a880b8174","Type":"ContainerDied","Data":"55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb"} Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.302013 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.348162 4765 generic.go:334] "Generic (PLEG): container finished" podID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerID="3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a" exitCode=0 Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.348204 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c001d2a5-b27f-456a-a2b1-141a880b8174","Type":"ContainerDied","Data":"3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a"} Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.348235 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"c001d2a5-b27f-456a-a2b1-141a880b8174","Type":"ContainerDied","Data":"2ac1bf149582bca5b2370067c7b7146af05ea1e4a389a55e841010130fa812b8"} Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.348259 4765 scope.go:117] "RemoveContainer" containerID="4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.348424 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.401093 4765 scope.go:117] "RemoveContainer" containerID="5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.420658 4765 scope.go:117] "RemoveContainer" containerID="3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.442199 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-combined-ca-bundle\") pod \"c001d2a5-b27f-456a-a2b1-141a880b8174\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.442299 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2x8v\" (UniqueName: \"kubernetes.io/projected/c001d2a5-b27f-456a-a2b1-141a880b8174-kube-api-access-k2x8v\") pod \"c001d2a5-b27f-456a-a2b1-141a880b8174\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.442344 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-scripts\") pod \"c001d2a5-b27f-456a-a2b1-141a880b8174\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.442377 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-config-data\") pod \"c001d2a5-b27f-456a-a2b1-141a880b8174\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.442423 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-log-httpd\") pod \"c001d2a5-b27f-456a-a2b1-141a880b8174\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.443188 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c001d2a5-b27f-456a-a2b1-141a880b8174" (UID: "c001d2a5-b27f-456a-a2b1-141a880b8174"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.442640 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-ceilometer-tls-certs\") pod \"c001d2a5-b27f-456a-a2b1-141a880b8174\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.443417 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-sg-core-conf-yaml\") pod \"c001d2a5-b27f-456a-a2b1-141a880b8174\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.443477 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-run-httpd\") pod \"c001d2a5-b27f-456a-a2b1-141a880b8174\" (UID: \"c001d2a5-b27f-456a-a2b1-141a880b8174\") " Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.444024 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.444015 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c001d2a5-b27f-456a-a2b1-141a880b8174" (UID: "c001d2a5-b27f-456a-a2b1-141a880b8174"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.446273 4765 scope.go:117] "RemoveContainer" containerID="55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.459187 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c001d2a5-b27f-456a-a2b1-141a880b8174-kube-api-access-k2x8v" (OuterVolumeSpecName: "kube-api-access-k2x8v") pod "c001d2a5-b27f-456a-a2b1-141a880b8174" (UID: "c001d2a5-b27f-456a-a2b1-141a880b8174"). InnerVolumeSpecName "kube-api-access-k2x8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.462100 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-scripts" (OuterVolumeSpecName: "scripts") pod "c001d2a5-b27f-456a-a2b1-141a880b8174" (UID: "c001d2a5-b27f-456a-a2b1-141a880b8174"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.474015 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c001d2a5-b27f-456a-a2b1-141a880b8174" (UID: "c001d2a5-b27f-456a-a2b1-141a880b8174"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.490882 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c001d2a5-b27f-456a-a2b1-141a880b8174" (UID: "c001d2a5-b27f-456a-a2b1-141a880b8174"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.528270 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c001d2a5-b27f-456a-a2b1-141a880b8174" (UID: "c001d2a5-b27f-456a-a2b1-141a880b8174"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.545336 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.545370 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c001d2a5-b27f-456a-a2b1-141a880b8174-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.545381 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.545390 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2x8v\" (UniqueName: \"kubernetes.io/projected/c001d2a5-b27f-456a-a2b1-141a880b8174-kube-api-access-k2x8v\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.545402 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.545410 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.546188 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-config-data" (OuterVolumeSpecName: "config-data") pod "c001d2a5-b27f-456a-a2b1-141a880b8174" (UID: "c001d2a5-b27f-456a-a2b1-141a880b8174"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.551451 4765 scope.go:117] "RemoveContainer" containerID="4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7" Oct 03 09:03:02 crc kubenswrapper[4765]: E1003 09:03:02.551874 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7\": container with ID starting with 4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7 not found: ID does not exist" containerID="4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.551930 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7"} err="failed to get container status \"4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7\": rpc error: code = NotFound desc = could not find container \"4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7\": container with ID starting with 4b0d8c497d1cf0e9c1f70cb9e2f3be559d1151f9f481999af0ac9f89b13ceee7 not found: ID does not exist" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.551955 4765 scope.go:117] "RemoveContainer" containerID="5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f" Oct 03 09:03:02 crc kubenswrapper[4765]: E1003 09:03:02.552295 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f\": container with ID starting with 5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f not found: ID does not exist" containerID="5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.552415 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f"} err="failed to get container status \"5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f\": rpc error: code = NotFound desc = could not find container \"5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f\": container with ID starting with 5ff432912d77484e720f934975943fa00031a7c8eb3579ae1b3e2b17e52fc02f not found: ID does not exist" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.552524 4765 scope.go:117] "RemoveContainer" containerID="3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a" Oct 03 09:03:02 crc kubenswrapper[4765]: E1003 09:03:02.552986 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a\": container with ID starting with 3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a not found: ID does not exist" containerID="3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.553008 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a"} err="failed to get container status \"3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a\": rpc error: code = NotFound desc = could not 
find container \"3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a\": container with ID starting with 3f543215158d5bbf414aad2c250acf81a88b3a0690ca4b892e71402a82a9207a not found: ID does not exist" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.553023 4765 scope.go:117] "RemoveContainer" containerID="55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb" Oct 03 09:03:02 crc kubenswrapper[4765]: E1003 09:03:02.553193 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb\": container with ID starting with 55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb not found: ID does not exist" containerID="55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.553214 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb"} err="failed to get container status \"55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb\": rpc error: code = NotFound desc = could not find container \"55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb\": container with ID starting with 55e483c8bee6161449985dd163cf8b26544fc6755d63dde24d62b842676929bb not found: ID does not exist" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.646333 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c001d2a5-b27f-456a-a2b1-141a880b8174-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.693398 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.705375 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.770513 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:03:02 crc kubenswrapper[4765]: E1003 09:03:02.779190 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="sg-core" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.779292 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="sg-core" Oct 03 09:03:02 crc kubenswrapper[4765]: E1003 09:03:02.779378 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="ceilometer-central-agent" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.779451 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="ceilometer-central-agent" Oct 03 09:03:02 crc kubenswrapper[4765]: E1003 09:03:02.779542 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" containerName="extract-content" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.779727 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" containerName="extract-content" Oct 03 09:03:02 crc kubenswrapper[4765]: E1003 09:03:02.779838 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" 
containerName="registry-server" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.779915 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" containerName="registry-server" Oct 03 09:03:02 crc kubenswrapper[4765]: E1003 09:03:02.780009 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="proxy-httpd" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.780087 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="proxy-httpd" Oct 03 09:03:02 crc kubenswrapper[4765]: E1003 09:03:02.780166 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" containerName="extract-utilities" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.780245 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" containerName="extract-utilities" Oct 03 09:03:02 crc kubenswrapper[4765]: E1003 09:03:02.780413 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="ceilometer-notification-agent" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.780493 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="ceilometer-notification-agent" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.781015 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c18f25e-29fa-49bb-8c72-38b1db164e6c" containerName="registry-server" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.781124 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="sg-core" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.781218 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="ceilometer-central-agent" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.781303 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="ceilometer-notification-agent" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.781402 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" containerName="proxy-httpd" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.783836 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.786692 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.786988 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.787514 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.821367 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.850538 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.850612 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-run-httpd\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.850662 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-log-httpd\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.850717 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfws9\" (UniqueName: \"kubernetes.io/projected/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-kube-api-access-tfws9\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.850748 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.850774 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.850840 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-config-data\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.850862 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-scripts\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.952605 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.952675 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.952740 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-config-data\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.952758 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-scripts\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.953400 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.953451 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-run-httpd\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.953477 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-log-httpd\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.953524 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfws9\" (UniqueName: \"kubernetes.io/projected/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-kube-api-access-tfws9\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.954447 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-run-httpd\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " 
pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.954715 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-log-httpd\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.957305 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.957505 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-scripts\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.957788 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-config-data\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.960753 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.961182 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:02 crc kubenswrapper[4765]: I1003 09:03:02.974415 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfws9\" (UniqueName: \"kubernetes.io/projected/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-kube-api-access-tfws9\") pod \"ceilometer-0\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:03 crc kubenswrapper[4765]: I1003 09:03:03.100727 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:03 crc kubenswrapper[4765]: I1003 09:03:03.558687 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:03:03 crc kubenswrapper[4765]: W1003 09:03:03.561952 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cc99b7b_ff5c_4e91_a07c_52f372cd1fca.slice/crio-138dd69f06db6ca626bc445a63ac569671a8911838a7d468c4c41c202487762a WatchSource:0}: Error finding container 138dd69f06db6ca626bc445a63ac569671a8911838a7d468c4c41c202487762a: Status 404 returned error can't find the container with id 138dd69f06db6ca626bc445a63ac569671a8911838a7d468c4c41c202487762a Oct 03 09:03:04 crc kubenswrapper[4765]: I1003 09:03:04.316701 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c001d2a5-b27f-456a-a2b1-141a880b8174" path="/var/lib/kubelet/pods/c001d2a5-b27f-456a-a2b1-141a880b8174/volumes" Oct 03 09:03:04 crc kubenswrapper[4765]: I1003 09:03:04.375453 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca","Type":"ContainerStarted","Data":"0ed793914a926a330301fc682306b9862b81d5c880898a1ca7915a9f72dc751e"} Oct 03 09:03:04 crc kubenswrapper[4765]: I1003 09:03:04.375499 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca","Type":"ContainerStarted","Data":"138dd69f06db6ca626bc445a63ac569671a8911838a7d468c4c41c202487762a"} Oct 03 09:03:05 crc kubenswrapper[4765]: I1003 09:03:05.386301 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca","Type":"ContainerStarted","Data":"551ebb5a4921e3577d7f5c4c4913b0f4198e76251b38360b0e5f9bc493ad69fc"} Oct 03 09:03:06 crc kubenswrapper[4765]: I1003 09:03:06.421616 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca","Type":"ContainerStarted","Data":"3799d764f3c9903802bca2d1df0d3fbc3bfcb31a932fba626fe6faac9b780c75"} Oct 03 09:03:07 crc kubenswrapper[4765]: I1003 09:03:07.432727 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca","Type":"ContainerStarted","Data":"cd1b44b04a74a6c33dd93784d299aa05f27a684d0a8af972bd2779294c393815"} Oct 03 09:03:07 crc kubenswrapper[4765]: I1003 09:03:07.434564 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:07 crc kubenswrapper[4765]: I1003 09:03:07.478842 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.196705252 podStartE2EDuration="5.478820347s" podCreationTimestamp="2025-10-03 09:03:02 +0000 UTC" firstStartedPulling="2025-10-03 09:03:03.564347887 +0000 UTC m=+1427.865842217" lastFinishedPulling="2025-10-03 09:03:06.846462982 +0000 UTC m=+1431.147957312" observedRunningTime="2025-10-03 09:03:07.470674272 +0000 UTC m=+1431.772168602" watchObservedRunningTime="2025-10-03 09:03:07.478820347 +0000 UTC m=+1431.780314677" Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.715252 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/memcached-0"] Oct 03 09:03:14 crc 
kubenswrapper[4765]: I1003 09:03:14.715993 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/memcached-0" podUID="2336fb85-91d3-4450-90ee-52264f3dc39f" containerName="memcached" containerID="cri-o://95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088" gracePeriod=30 Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.744066 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.744379 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="02077287-6b7d-4d30-8a19-e7f42699e5d2" containerName="watcher-applier" containerID="cri-o://fb430e7d139d8ef6c9d2c00d38b03c6289fd36807707c4b86a3953e4fc3f713d" gracePeriod=30 Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.752876 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.753415 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="1eef47c2-953c-4299-a994-a3ab5583b2d0" containerName="watcher-decision-engine" containerID="cri-o://3475db93fee76cefce14c7c9aef47cf2bafa98b960c66b70f4d3da860393bb9d" gracePeriod=30 Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.762233 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.762533 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="5c3fda04-8007-4e3b-9283-9a28e6c57c7c" containerName="watcher-kuttl-api-log" containerID="cri-o://10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b" gracePeriod=30 Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.762759 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="5c3fda04-8007-4e3b-9283-9a28e6c57c7c" containerName="watcher-api" containerID="cri-o://b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5" gracePeriod=30 Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.914251 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-dgcbq"] Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.920733 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-dgcbq"] Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.997510 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-kbjbs"] Oct 03 09:03:14 crc kubenswrapper[4765]: I1003 09:03:14.998600 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.002783 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-mtls" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.010637 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-kbjbs"] Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.098459 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-scripts\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.098502 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh4rn\" (UniqueName: \"kubernetes.io/projected/3645711c-7984-4502-aee7-98e45640eaa9-kube-api-access-jh4rn\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.098521 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-fernet-keys\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.098542 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-credential-keys\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.098803 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-combined-ca-bundle\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.098884 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-config-data\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.098931 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-cert-memcached-mtls\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.200244 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-cert-memcached-mtls\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.200402 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-scripts\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.200429 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh4rn\" (UniqueName: \"kubernetes.io/projected/3645711c-7984-4502-aee7-98e45640eaa9-kube-api-access-jh4rn\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.200453 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-fernet-keys\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.200478 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-credential-keys\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.200525 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-combined-ca-bundle\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.200559 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-config-data\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.207253 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-scripts\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.207517 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-cert-memcached-mtls\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.209557 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-config-data\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.209883 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-fernet-keys\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.210150 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-combined-ca-bundle\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.216096 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-credential-keys\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.219982 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh4rn\" (UniqueName: \"kubernetes.io/projected/3645711c-7984-4502-aee7-98e45640eaa9-kube-api-access-jh4rn\") pod \"keystone-bootstrap-kbjbs\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.355638 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.508360 4765 generic.go:334] "Generic (PLEG): container finished" podID="5c3fda04-8007-4e3b-9283-9a28e6c57c7c" containerID="10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b" exitCode=143 Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.508415 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5c3fda04-8007-4e3b-9283-9a28e6c57c7c","Type":"ContainerDied","Data":"10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b"} Oct 03 09:03:15 crc kubenswrapper[4765]: I1003 09:03:15.832066 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-kbjbs"] Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.238477 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.348287 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3c84158-3adc-480a-8e89-c28795415db5" path="/var/lib/kubelet/pods/b3c84158-3adc-480a-8e89-c28795415db5/volumes" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.428218 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-custom-prometheus-ca\") pod \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.428265 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-logs\") pod \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.428393 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-internal-tls-certs\") pod \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.428433 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-public-tls-certs\") pod \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.428481 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-config-data\") pod \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.428506 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-combined-ca-bundle\") pod \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.428542 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjrfd\" (UniqueName: \"kubernetes.io/projected/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-kube-api-access-fjrfd\") pod \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\" (UID: \"5c3fda04-8007-4e3b-9283-9a28e6c57c7c\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.432126 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-logs" (OuterVolumeSpecName: "logs") pod "5c3fda04-8007-4e3b-9283-9a28e6c57c7c" (UID: "5c3fda04-8007-4e3b-9283-9a28e6c57c7c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.442922 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-kube-api-access-fjrfd" (OuterVolumeSpecName: "kube-api-access-fjrfd") pod "5c3fda04-8007-4e3b-9283-9a28e6c57c7c" (UID: "5c3fda04-8007-4e3b-9283-9a28e6c57c7c"). InnerVolumeSpecName "kube-api-access-fjrfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.474137 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c3fda04-8007-4e3b-9283-9a28e6c57c7c" (UID: "5c3fda04-8007-4e3b-9283-9a28e6c57c7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.474264 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "5c3fda04-8007-4e3b-9283-9a28e6c57c7c" (UID: "5c3fda04-8007-4e3b-9283-9a28e6c57c7c"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.478998 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5c3fda04-8007-4e3b-9283-9a28e6c57c7c" (UID: "5c3fda04-8007-4e3b-9283-9a28e6c57c7c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.483080 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5c3fda04-8007-4e3b-9283-9a28e6c57c7c" (UID: "5c3fda04-8007-4e3b-9283-9a28e6c57c7c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.499006 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-config-data" (OuterVolumeSpecName: "config-data") pod "5c3fda04-8007-4e3b-9283-9a28e6c57c7c" (UID: "5c3fda04-8007-4e3b-9283-9a28e6c57c7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.514061 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.521201 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" event={"ID":"3645711c-7984-4502-aee7-98e45640eaa9","Type":"ContainerStarted","Data":"d63ad9af49bea67159a5f84c95383734beb6b9e672682fefa4010326a84c3331"} Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.521292 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" event={"ID":"3645711c-7984-4502-aee7-98e45640eaa9","Type":"ContainerStarted","Data":"7393c7551885273b28d2ecc56ac02883b9bb03037ede752ebdc86469457653fb"} Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.522789 4765 generic.go:334] "Generic (PLEG): container finished" podID="5c3fda04-8007-4e3b-9283-9a28e6c57c7c" containerID="b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5" exitCode=0 Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.523131 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5c3fda04-8007-4e3b-9283-9a28e6c57c7c","Type":"ContainerDied","Data":"b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5"} Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.523154 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"5c3fda04-8007-4e3b-9283-9a28e6c57c7c","Type":"ContainerDied","Data":"4f38d01ece8841ac7637f5362e3b48408f7a82468b71be9891b50fa71506a684"} Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.523172 4765 scope.go:117] "RemoveContainer" containerID="b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.523311 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.530193 4765 generic.go:334] "Generic (PLEG): container finished" podID="2336fb85-91d3-4450-90ee-52264f3dc39f" containerID="95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088" exitCode=0 Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.530242 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"2336fb85-91d3-4450-90ee-52264f3dc39f","Type":"ContainerDied","Data":"95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088"} Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.530267 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"2336fb85-91d3-4450-90ee-52264f3dc39f","Type":"ContainerDied","Data":"5a4a2e4cd8d1e7efb6c2f15eba9297355f243ce065b8786932b959cdc56eea07"} Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.530317 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.532838 4765 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.532900 4765 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.532913 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.532924 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.532935 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjrfd\" (UniqueName: \"kubernetes.io/projected/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-kube-api-access-fjrfd\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.532948 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.532959 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c3fda04-8007-4e3b-9283-9a28e6c57c7c-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.579202 4765 scope.go:117] "RemoveContainer" containerID="10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.593279 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" podStartSLOduration=2.593240443 podStartE2EDuration="2.593240443s" podCreationTimestamp="2025-10-03 09:03:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:16.581581789 +0000 UTC m=+1440.883076119" watchObservedRunningTime="2025-10-03 09:03:16.593240443 +0000 UTC m=+1440.894734773" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.626308 4765 scope.go:117] "RemoveContainer" containerID="b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5" Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.629728 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5\": container with ID starting with b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5 not found: ID does not exist" containerID="b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.629786 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5"} err="failed to get container status \"b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5\": rpc error: code = NotFound desc = could not find container \"b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5\": container with ID starting with b90e44508f71fa956b268881a4af09fc788044b0a23b91bc37720587a17b35b5 not found: ID does not exist" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.629820 4765 scope.go:117] "RemoveContainer" containerID="10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.633501 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-config-data\") pod \"2336fb85-91d3-4450-90ee-52264f3dc39f\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.633621 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-kolla-config\") pod \"2336fb85-91d3-4450-90ee-52264f3dc39f\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.634243 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "2336fb85-91d3-4450-90ee-52264f3dc39f" (UID: "2336fb85-91d3-4450-90ee-52264f3dc39f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.634390 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-config-data" (OuterVolumeSpecName: "config-data") pod "2336fb85-91d3-4450-90ee-52264f3dc39f" (UID: "2336fb85-91d3-4450-90ee-52264f3dc39f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.634923 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-memcached-tls-certs\") pod \"2336fb85-91d3-4450-90ee-52264f3dc39f\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.634986 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-combined-ca-bundle\") pod \"2336fb85-91d3-4450-90ee-52264f3dc39f\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.635017 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6tgg\" (UniqueName: \"kubernetes.io/projected/2336fb85-91d3-4450-90ee-52264f3dc39f-kube-api-access-h6tgg\") pod \"2336fb85-91d3-4450-90ee-52264f3dc39f\" (UID: \"2336fb85-91d3-4450-90ee-52264f3dc39f\") " Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.635681 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.635698 4765 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2336fb85-91d3-4450-90ee-52264f3dc39f-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.637841 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b\": container with ID starting with 10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b not found: ID does not exist" containerID="10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.637942 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b"} err="failed to get container status \"10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b\": rpc error: code = NotFound desc = could not find container \"10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b\": container with ID starting with 10e554f2337fa67d36ffcbf048fca52097cf515a04ca2df4b373f5bb6888638b not found: ID does not exist" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.637973 4765 scope.go:117] "RemoveContainer" containerID="95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.640638 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2336fb85-91d3-4450-90ee-52264f3dc39f-kube-api-access-h6tgg" (OuterVolumeSpecName: "kube-api-access-h6tgg") pod "2336fb85-91d3-4450-90ee-52264f3dc39f" (UID: "2336fb85-91d3-4450-90ee-52264f3dc39f"). InnerVolumeSpecName "kube-api-access-h6tgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.663198 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.678021 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.685165 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.685539 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3fda04-8007-4e3b-9283-9a28e6c57c7c" containerName="watcher-api" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.685558 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3fda04-8007-4e3b-9283-9a28e6c57c7c" containerName="watcher-api" Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.685573 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3fda04-8007-4e3b-9283-9a28e6c57c7c" containerName="watcher-kuttl-api-log" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.685582 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3fda04-8007-4e3b-9283-9a28e6c57c7c" containerName="watcher-kuttl-api-log" Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.685614 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2336fb85-91d3-4450-90ee-52264f3dc39f" containerName="memcached" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.685622 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2336fb85-91d3-4450-90ee-52264f3dc39f" containerName="memcached" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.686771 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3fda04-8007-4e3b-9283-9a28e6c57c7c" containerName="watcher-api" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.686808 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2336fb85-91d3-4450-90ee-52264f3dc39f" containerName="memcached" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.686834 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3fda04-8007-4e3b-9283-9a28e6c57c7c" containerName="watcher-kuttl-api-log" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.687915 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.696149 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.696385 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-internal-svc" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.696572 4765 scope.go:117] "RemoveContainer" containerID="95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.701824 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-watcher-public-svc" Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.716850 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088\": container with ID starting with 95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088 not found: ID does not exist" containerID="95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.716927 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088"} err="failed to get container status \"95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088\": rpc error: code = NotFound desc = could not find container \"95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088\": container with ID starting with 95cc04ebd1b6e9267df24d6177f27b547015ef04c08a3eea05707e0e0c268088 not found: ID does not exist" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.739593 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6tgg\" (UniqueName: \"kubernetes.io/projected/2336fb85-91d3-4450-90ee-52264f3dc39f-kube-api-access-h6tgg\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.744923 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.746039 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "2336fb85-91d3-4450-90ee-52264f3dc39f" (UID: "2336fb85-91d3-4450-90ee-52264f3dc39f"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.795894 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2336fb85-91d3-4450-90ee-52264f3dc39f" (UID: "2336fb85-91d3-4450-90ee-52264f3dc39f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.811960 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3475db93fee76cefce14c7c9aef47cf2bafa98b960c66b70f4d3da860393bb9d" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.824201 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3475db93fee76cefce14c7c9aef47cf2bafa98b960c66b70f4d3da860393bb9d" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.825886 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3475db93fee76cefce14c7c9aef47cf2bafa98b960c66b70f4d3da860393bb9d" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.825934 4765 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="1eef47c2-953c-4299-a994-a3ab5583b2d0" containerName="watcher-decision-engine" Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.826039 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fb430e7d139d8ef6c9d2c00d38b03c6289fd36807707c4b86a3953e4fc3f713d" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.829750 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fb430e7d139d8ef6c9d2c00d38b03c6289fd36807707c4b86a3953e4fc3f713d" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.842458 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trvjt\" (UniqueName: \"kubernetes.io/projected/9ba86238-0b43-4a15-878a-1c0495098797-kube-api-access-trvjt\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.842526 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.842546 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-cert-memcached-mtls\") 
pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.842563 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba86238-0b43-4a15-878a-1c0495098797-logs\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.842759 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fb430e7d139d8ef6c9d2c00d38b03c6289fd36807707c4b86a3953e4fc3f713d" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:03:16 crc kubenswrapper[4765]: E1003 09:03:16.842820 4765 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="02077287-6b7d-4d30-8a19-e7f42699e5d2" containerName="watcher-applier" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.843366 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.843417 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.844390 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.844464 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.844545 4765 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.844569 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2336fb85-91d3-4450-90ee-52264f3dc39f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.905877 4765 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/memcached-0"] Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.920913 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/memcached-0"] Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.944900 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/memcached-0"] Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.945995 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.948647 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.948724 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.948766 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trvjt\" (UniqueName: \"kubernetes.io/projected/9ba86238-0b43-4a15-878a-1c0495098797-kube-api-access-trvjt\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.948793 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.948809 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.948824 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba86238-0b43-4a15-878a-1c0495098797-logs\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.948866 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.948894 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: 
\"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.949782 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-memcached-svc" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.949989 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"watcher-kuttl-default"/"memcached-config-data" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.951110 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"memcached-memcached-dockercfg-9c22t" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.951682 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba86238-0b43-4a15-878a-1c0495098797-logs\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.970294 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.976690 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.976700 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.976826 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.978720 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.979905 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-internal-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.981584 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trvjt\" (UniqueName: \"kubernetes.io/projected/9ba86238-0b43-4a15-878a-1c0495098797-kube-api-access-trvjt\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:16 crc kubenswrapper[4765]: I1003 09:03:16.985208 4765 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-public-tls-certs\") pod \"watcher-kuttl-api-0\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.050516 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3023d790-ea2b-44fc-9255-338f822b368c-kolla-config\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.050569 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3023d790-ea2b-44fc-9255-338f822b368c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.050592 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pztgc\" (UniqueName: \"kubernetes.io/projected/3023d790-ea2b-44fc-9255-338f822b368c-kube-api-access-pztgc\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.050608 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3023d790-ea2b-44fc-9255-338f822b368c-config-data\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.050684 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3023d790-ea2b-44fc-9255-338f822b368c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.070178 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.156219 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3023d790-ea2b-44fc-9255-338f822b368c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.156294 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3023d790-ea2b-44fc-9255-338f822b368c-kolla-config\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.156331 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3023d790-ea2b-44fc-9255-338f822b368c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.156354 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pztgc\" (UniqueName: \"kubernetes.io/projected/3023d790-ea2b-44fc-9255-338f822b368c-kube-api-access-pztgc\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.156378 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3023d790-ea2b-44fc-9255-338f822b368c-config-data\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.157392 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3023d790-ea2b-44fc-9255-338f822b368c-kolla-config\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.157883 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3023d790-ea2b-44fc-9255-338f822b368c-config-data\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.164575 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3023d790-ea2b-44fc-9255-338f822b368c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.173014 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3023d790-ea2b-44fc-9255-338f822b368c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.180137 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pztgc\" (UniqueName: 
\"kubernetes.io/projected/3023d790-ea2b-44fc-9255-338f822b368c-kube-api-access-pztgc\") pod \"memcached-0\" (UID: \"3023d790-ea2b-44fc-9255-338f822b368c\") " pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.364896 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:17 crc kubenswrapper[4765]: W1003 09:03:17.580032 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ba86238_0b43_4a15_878a_1c0495098797.slice/crio-8a927fa22b80fe8f447ddb7c014121e80154d5ac20462b0f07cfdc186c35287a WatchSource:0}: Error finding container 8a927fa22b80fe8f447ddb7c014121e80154d5ac20462b0f07cfdc186c35287a: Status 404 returned error can't find the container with id 8a927fa22b80fe8f447ddb7c014121e80154d5ac20462b0f07cfdc186c35287a Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.581207 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:17 crc kubenswrapper[4765]: I1003 09:03:17.880705 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/memcached-0"] Oct 03 09:03:17 crc kubenswrapper[4765]: W1003 09:03:17.880735 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3023d790_ea2b_44fc_9255_338f822b368c.slice/crio-2e3225e26341e2be8c5fc5f6082700c2f7e6138b68b654bd202df4576f2b1391 WatchSource:0}: Error finding container 2e3225e26341e2be8c5fc5f6082700c2f7e6138b68b654bd202df4576f2b1391: Status 404 returned error can't find the container with id 2e3225e26341e2be8c5fc5f6082700c2f7e6138b68b654bd202df4576f2b1391 Oct 03 09:03:18 crc kubenswrapper[4765]: I1003 09:03:18.317136 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2336fb85-91d3-4450-90ee-52264f3dc39f" path="/var/lib/kubelet/pods/2336fb85-91d3-4450-90ee-52264f3dc39f/volumes" Oct 03 09:03:18 crc kubenswrapper[4765]: I1003 09:03:18.317692 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c3fda04-8007-4e3b-9283-9a28e6c57c7c" path="/var/lib/kubelet/pods/5c3fda04-8007-4e3b-9283-9a28e6c57c7c/volumes" Oct 03 09:03:18 crc kubenswrapper[4765]: I1003 09:03:18.551495 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"3023d790-ea2b-44fc-9255-338f822b368c","Type":"ContainerStarted","Data":"5009b7927c38484ea68ae7966b90151e7874de3d46a7ed530ac844c7519a318a"} Oct 03 09:03:18 crc kubenswrapper[4765]: I1003 09:03:18.551544 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/memcached-0" event={"ID":"3023d790-ea2b-44fc-9255-338f822b368c","Type":"ContainerStarted","Data":"2e3225e26341e2be8c5fc5f6082700c2f7e6138b68b654bd202df4576f2b1391"} Oct 03 09:03:18 crc kubenswrapper[4765]: I1003 09:03:18.551581 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:18 crc kubenswrapper[4765]: I1003 09:03:18.555436 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9ba86238-0b43-4a15-878a-1c0495098797","Type":"ContainerStarted","Data":"7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3"} Oct 03 09:03:18 crc kubenswrapper[4765]: I1003 09:03:18.555508 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9ba86238-0b43-4a15-878a-1c0495098797","Type":"ContainerStarted","Data":"b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889"} Oct 03 09:03:18 crc kubenswrapper[4765]: I1003 09:03:18.555524 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9ba86238-0b43-4a15-878a-1c0495098797","Type":"ContainerStarted","Data":"8a927fa22b80fe8f447ddb7c014121e80154d5ac20462b0f07cfdc186c35287a"} Oct 03 09:03:18 crc kubenswrapper[4765]: I1003 09:03:18.556570 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:18 crc kubenswrapper[4765]: I1003 09:03:18.575274 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/memcached-0" podStartSLOduration=2.575253944 podStartE2EDuration="2.575253944s" podCreationTimestamp="2025-10-03 09:03:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:18.570854833 +0000 UTC m=+1442.872349173" watchObservedRunningTime="2025-10-03 09:03:18.575253944 +0000 UTC m=+1442.876748274" Oct 03 09:03:18 crc kubenswrapper[4765]: I1003 09:03:18.593836 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.593819963 podStartE2EDuration="2.593819963s" podCreationTimestamp="2025-10-03 09:03:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:18.587755229 +0000 UTC m=+1442.889249569" watchObservedRunningTime="2025-10-03 09:03:18.593819963 +0000 UTC m=+1442.895314293" Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.574285 4765 generic.go:334] "Generic (PLEG): container finished" podID="02077287-6b7d-4d30-8a19-e7f42699e5d2" containerID="fb430e7d139d8ef6c9d2c00d38b03c6289fd36807707c4b86a3953e4fc3f713d" exitCode=0 Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.575204 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"02077287-6b7d-4d30-8a19-e7f42699e5d2","Type":"ContainerDied","Data":"fb430e7d139d8ef6c9d2c00d38b03c6289fd36807707c4b86a3953e4fc3f713d"} Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.575639 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"02077287-6b7d-4d30-8a19-e7f42699e5d2","Type":"ContainerDied","Data":"3a6d4977b5de3fce7113a8a0559b82e44601759afb53733a6c517a6a4ffa30dd"} Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.575696 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a6d4977b5de3fce7113a8a0559b82e44601759afb53733a6c517a6a4ffa30dd" Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.609861 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.711377 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-config-data\") pod \"02077287-6b7d-4d30-8a19-e7f42699e5d2\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.711445 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02077287-6b7d-4d30-8a19-e7f42699e5d2-logs\") pod \"02077287-6b7d-4d30-8a19-e7f42699e5d2\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.711561 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9m5s\" (UniqueName: \"kubernetes.io/projected/02077287-6b7d-4d30-8a19-e7f42699e5d2-kube-api-access-j9m5s\") pod \"02077287-6b7d-4d30-8a19-e7f42699e5d2\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.711620 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-combined-ca-bundle\") pod \"02077287-6b7d-4d30-8a19-e7f42699e5d2\" (UID: \"02077287-6b7d-4d30-8a19-e7f42699e5d2\") " Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.715134 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02077287-6b7d-4d30-8a19-e7f42699e5d2-logs" (OuterVolumeSpecName: "logs") pod "02077287-6b7d-4d30-8a19-e7f42699e5d2" (UID: "02077287-6b7d-4d30-8a19-e7f42699e5d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.741783 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02077287-6b7d-4d30-8a19-e7f42699e5d2-kube-api-access-j9m5s" (OuterVolumeSpecName: "kube-api-access-j9m5s") pod "02077287-6b7d-4d30-8a19-e7f42699e5d2" (UID: "02077287-6b7d-4d30-8a19-e7f42699e5d2"). InnerVolumeSpecName "kube-api-access-j9m5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.749919 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02077287-6b7d-4d30-8a19-e7f42699e5d2" (UID: "02077287-6b7d-4d30-8a19-e7f42699e5d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.783367 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-config-data" (OuterVolumeSpecName: "config-data") pod "02077287-6b7d-4d30-8a19-e7f42699e5d2" (UID: "02077287-6b7d-4d30-8a19-e7f42699e5d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.813387 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9m5s\" (UniqueName: \"kubernetes.io/projected/02077287-6b7d-4d30-8a19-e7f42699e5d2-kube-api-access-j9m5s\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.813424 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.813436 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02077287-6b7d-4d30-8a19-e7f42699e5d2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:19 crc kubenswrapper[4765]: I1003 09:03:19.813449 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02077287-6b7d-4d30-8a19-e7f42699e5d2-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.584355 4765 generic.go:334] "Generic (PLEG): container finished" podID="3645711c-7984-4502-aee7-98e45640eaa9" containerID="d63ad9af49bea67159a5f84c95383734beb6b9e672682fefa4010326a84c3331" exitCode=0 Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.584441 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.584467 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" event={"ID":"3645711c-7984-4502-aee7-98e45640eaa9","Type":"ContainerDied","Data":"d63ad9af49bea67159a5f84c95383734beb6b9e672682fefa4010326a84c3331"} Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.584638 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.622535 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.628482 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.643499 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:03:20 crc kubenswrapper[4765]: E1003 09:03:20.643888 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02077287-6b7d-4d30-8a19-e7f42699e5d2" containerName="watcher-applier" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.643910 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="02077287-6b7d-4d30-8a19-e7f42699e5d2" containerName="watcher-applier" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.644059 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="02077287-6b7d-4d30-8a19-e7f42699e5d2" containerName="watcher-applier" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.644746 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.647182 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.653530 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.829018 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf49515d-149d-4084-a0bc-5f5ddb0d6739-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.829079 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gskbs\" (UniqueName: \"kubernetes.io/projected/cf49515d-149d-4084-a0bc-5f5ddb0d6739-kube-api-access-gskbs\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.829100 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.829139 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.829180 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.930271 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.930382 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf49515d-149d-4084-a0bc-5f5ddb0d6739-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.930413 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gskbs\" (UniqueName: \"kubernetes.io/projected/cf49515d-149d-4084-a0bc-5f5ddb0d6739-kube-api-access-gskbs\") pod \"watcher-kuttl-applier-0\" 
(UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.930431 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.930466 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.930877 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf49515d-149d-4084-a0bc-5f5ddb0d6739-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.934633 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.934696 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.949240 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:20 crc kubenswrapper[4765]: I1003 09:03:20.957634 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gskbs\" (UniqueName: \"kubernetes.io/projected/cf49515d-149d-4084-a0bc-5f5ddb0d6739-kube-api-access-gskbs\") pod \"watcher-kuttl-applier-0\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:21 crc kubenswrapper[4765]: I1003 09:03:21.002751 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:21 crc kubenswrapper[4765]: I1003 09:03:21.087297 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:21 crc kubenswrapper[4765]: I1003 09:03:21.459827 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:03:21 crc kubenswrapper[4765]: W1003 09:03:21.462020 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf49515d_149d_4084_a0bc_5f5ddb0d6739.slice/crio-398b3a5a0f79f557634f4c7d07b15e1d0dd9580bf73560681aa45fd38c61f6c6 WatchSource:0}: Error finding container 398b3a5a0f79f557634f4c7d07b15e1d0dd9580bf73560681aa45fd38c61f6c6: Status 404 returned error can't find the container with id 398b3a5a0f79f557634f4c7d07b15e1d0dd9580bf73560681aa45fd38c61f6c6 Oct 03 09:03:21 crc kubenswrapper[4765]: I1003 09:03:21.598401 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"cf49515d-149d-4084-a0bc-5f5ddb0d6739","Type":"ContainerStarted","Data":"398b3a5a0f79f557634f4c7d07b15e1d0dd9580bf73560681aa45fd38c61f6c6"} Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.052527 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.073682 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.156889 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-combined-ca-bundle\") pod \"3645711c-7984-4502-aee7-98e45640eaa9\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.157031 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-config-data\") pod \"3645711c-7984-4502-aee7-98e45640eaa9\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.157096 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-scripts\") pod \"3645711c-7984-4502-aee7-98e45640eaa9\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.157151 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-credential-keys\") pod \"3645711c-7984-4502-aee7-98e45640eaa9\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.157242 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-cert-memcached-mtls\") pod \"3645711c-7984-4502-aee7-98e45640eaa9\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.157324 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-fernet-keys\") pod \"3645711c-7984-4502-aee7-98e45640eaa9\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.157353 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh4rn\" (UniqueName: \"kubernetes.io/projected/3645711c-7984-4502-aee7-98e45640eaa9-kube-api-access-jh4rn\") pod \"3645711c-7984-4502-aee7-98e45640eaa9\" (UID: \"3645711c-7984-4502-aee7-98e45640eaa9\") " Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.161361 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3645711c-7984-4502-aee7-98e45640eaa9" (UID: "3645711c-7984-4502-aee7-98e45640eaa9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.161449 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3645711c-7984-4502-aee7-98e45640eaa9" (UID: "3645711c-7984-4502-aee7-98e45640eaa9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.161828 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3645711c-7984-4502-aee7-98e45640eaa9-kube-api-access-jh4rn" (OuterVolumeSpecName: "kube-api-access-jh4rn") pod "3645711c-7984-4502-aee7-98e45640eaa9" (UID: "3645711c-7984-4502-aee7-98e45640eaa9"). InnerVolumeSpecName "kube-api-access-jh4rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.161908 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-scripts" (OuterVolumeSpecName: "scripts") pod "3645711c-7984-4502-aee7-98e45640eaa9" (UID: "3645711c-7984-4502-aee7-98e45640eaa9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.179965 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-config-data" (OuterVolumeSpecName: "config-data") pod "3645711c-7984-4502-aee7-98e45640eaa9" (UID: "3645711c-7984-4502-aee7-98e45640eaa9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.182612 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3645711c-7984-4502-aee7-98e45640eaa9" (UID: "3645711c-7984-4502-aee7-98e45640eaa9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.219816 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "3645711c-7984-4502-aee7-98e45640eaa9" (UID: "3645711c-7984-4502-aee7-98e45640eaa9"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.259403 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.259679 4765 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.259690 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.259700 4765 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.259709 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh4rn\" (UniqueName: \"kubernetes.io/projected/3645711c-7984-4502-aee7-98e45640eaa9-kube-api-access-jh4rn\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.259718 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.259726 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3645711c-7984-4502-aee7-98e45640eaa9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.319878 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02077287-6b7d-4d30-8a19-e7f42699e5d2" path="/var/lib/kubelet/pods/02077287-6b7d-4d30-8a19-e7f42699e5d2/volumes" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.609475 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.609447 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-bootstrap-kbjbs" event={"ID":"3645711c-7984-4502-aee7-98e45640eaa9","Type":"ContainerDied","Data":"7393c7551885273b28d2ecc56ac02883b9bb03037ede752ebdc86469457653fb"} Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.609693 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7393c7551885273b28d2ecc56ac02883b9bb03037ede752ebdc86469457653fb" Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.611194 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"cf49515d-149d-4084-a0bc-5f5ddb0d6739","Type":"ContainerStarted","Data":"29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e"} Oct 03 09:03:22 crc kubenswrapper[4765]: I1003 09:03:22.650913 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.650891183 podStartE2EDuration="2.650891183s" podCreationTimestamp="2025-10-03 09:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:22.643118407 +0000 UTC m=+1446.944612747" watchObservedRunningTime="2025-10-03 09:03:22.650891183 +0000 UTC m=+1446.952385513" Oct 03 09:03:26 crc kubenswrapper[4765]: I1003 09:03:26.003785 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:26 crc kubenswrapper[4765]: I1003 09:03:26.655467 4765 generic.go:334] "Generic (PLEG): container finished" podID="1eef47c2-953c-4299-a994-a3ab5583b2d0" containerID="3475db93fee76cefce14c7c9aef47cf2bafa98b960c66b70f4d3da860393bb9d" exitCode=0 Oct 03 09:03:26 crc kubenswrapper[4765]: I1003 09:03:26.655676 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"1eef47c2-953c-4299-a994-a3ab5583b2d0","Type":"ContainerDied","Data":"3475db93fee76cefce14c7c9aef47cf2bafa98b960c66b70f4d3da860393bb9d"} Oct 03 09:03:26 crc kubenswrapper[4765]: I1003 09:03:26.972060 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.071552 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.137706 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-config-data\") pod \"1eef47c2-953c-4299-a994-a3ab5583b2d0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.137774 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-combined-ca-bundle\") pod \"1eef47c2-953c-4299-a994-a3ab5583b2d0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.137827 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-custom-prometheus-ca\") pod \"1eef47c2-953c-4299-a994-a3ab5583b2d0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.137867 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eef47c2-953c-4299-a994-a3ab5583b2d0-logs\") pod \"1eef47c2-953c-4299-a994-a3ab5583b2d0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.138006 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljtwk\" (UniqueName: \"kubernetes.io/projected/1eef47c2-953c-4299-a994-a3ab5583b2d0-kube-api-access-ljtwk\") pod \"1eef47c2-953c-4299-a994-a3ab5583b2d0\" (UID: \"1eef47c2-953c-4299-a994-a3ab5583b2d0\") " Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.140205 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eef47c2-953c-4299-a994-a3ab5583b2d0-logs" (OuterVolumeSpecName: "logs") pod "1eef47c2-953c-4299-a994-a3ab5583b2d0" (UID: "1eef47c2-953c-4299-a994-a3ab5583b2d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.158113 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eef47c2-953c-4299-a994-a3ab5583b2d0-kube-api-access-ljtwk" (OuterVolumeSpecName: "kube-api-access-ljtwk") pod "1eef47c2-953c-4299-a994-a3ab5583b2d0" (UID: "1eef47c2-953c-4299-a994-a3ab5583b2d0"). InnerVolumeSpecName "kube-api-access-ljtwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.208003 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "1eef47c2-953c-4299-a994-a3ab5583b2d0" (UID: "1eef47c2-953c-4299-a994-a3ab5583b2d0"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.233927 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-config-data" (OuterVolumeSpecName: "config-data") pod "1eef47c2-953c-4299-a994-a3ab5583b2d0" (UID: "1eef47c2-953c-4299-a994-a3ab5583b2d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.239635 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljtwk\" (UniqueName: \"kubernetes.io/projected/1eef47c2-953c-4299-a994-a3ab5583b2d0-kube-api-access-ljtwk\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.239751 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.239760 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.239768 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eef47c2-953c-4299-a994-a3ab5583b2d0-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.244959 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1eef47c2-953c-4299-a994-a3ab5583b2d0" (UID: "1eef47c2-953c-4299-a994-a3ab5583b2d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.260615 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.341759 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eef47c2-953c-4299-a994-a3ab5583b2d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.366851 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/memcached-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.505427 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/keystone-7bb6879856-dm6qk"] Oct 03 09:03:27 crc kubenswrapper[4765]: E1003 09:03:27.506106 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3645711c-7984-4502-aee7-98e45640eaa9" containerName="keystone-bootstrap" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.506126 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3645711c-7984-4502-aee7-98e45640eaa9" containerName="keystone-bootstrap" Oct 03 09:03:27 crc kubenswrapper[4765]: E1003 09:03:27.506148 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eef47c2-953c-4299-a994-a3ab5583b2d0" containerName="watcher-decision-engine" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.506156 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eef47c2-953c-4299-a994-a3ab5583b2d0" containerName="watcher-decision-engine" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.506489 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eef47c2-953c-4299-a994-a3ab5583b2d0" containerName="watcher-decision-engine" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.506526 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3645711c-7984-4502-aee7-98e45640eaa9" containerName="keystone-bootstrap" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.507430 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.531351 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-7bb6879856-dm6qk"] Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.647421 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-internal-tls-certs\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.647481 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-fernet-keys\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.647564 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-config-data\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.647593 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhtj6\" (UniqueName: \"kubernetes.io/projected/0ab88eac-dab7-4528-9654-8d5b819ca4d6-kube-api-access-fhtj6\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.647622 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-scripts\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.647684 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-public-tls-certs\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.647727 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-combined-ca-bundle\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.647755 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-credential-keys\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" 
Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.647817 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-cert-memcached-mtls\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.669066 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"1eef47c2-953c-4299-a994-a3ab5583b2d0","Type":"ContainerDied","Data":"a6586bd5d098bf1c2249fe0eb885aa9ffc184dea725da899bd4cc5dda4f34c7c"} Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.669138 4765 scope.go:117] "RemoveContainer" containerID="3475db93fee76cefce14c7c9aef47cf2bafa98b960c66b70f4d3da860393bb9d" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.669092 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.680603 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.751637 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-cert-memcached-mtls\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.752589 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-internal-tls-certs\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.752704 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-fernet-keys\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.752872 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-config-data\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.752970 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhtj6\" (UniqueName: \"kubernetes.io/projected/0ab88eac-dab7-4528-9654-8d5b819ca4d6-kube-api-access-fhtj6\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.753047 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-scripts\") pod 
\"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.753164 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-public-tls-certs\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.754006 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-combined-ca-bundle\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.754089 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-credential-keys\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.755769 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.755790 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-cert-memcached-mtls\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.757715 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-credential-keys\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.768191 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-combined-ca-bundle\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.771280 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-public-tls-certs\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.771657 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-config-data\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.776994 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-internal-tls-certs\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.780617 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-fernet-keys\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.782887 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ab88eac-dab7-4528-9654-8d5b819ca4d6-scripts\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.786161 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhtj6\" (UniqueName: \"kubernetes.io/projected/0ab88eac-dab7-4528-9654-8d5b819ca4d6-kube-api-access-fhtj6\") pod \"keystone-7bb6879856-dm6qk\" (UID: \"0ab88eac-dab7-4528-9654-8d5b819ca4d6\") " pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.796342 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.814204 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.816308 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.819786 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.830054 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.840870 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.957958 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.958431 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58hdp\" (UniqueName: \"kubernetes.io/projected/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-kube-api-access-58hdp\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.958480 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.958510 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.958535 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:27 crc kubenswrapper[4765]: I1003 09:03:27.958560 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.060214 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.060278 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58hdp\" (UniqueName: \"kubernetes.io/projected/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-kube-api-access-58hdp\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.060319 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.060340 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.060359 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.060376 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.060875 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.066224 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.066259 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.066840 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.068122 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc 
kubenswrapper[4765]: I1003 09:03:28.076262 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58hdp\" (UniqueName: \"kubernetes.io/projected/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-kube-api-access-58hdp\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.178192 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.297033 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/keystone-7bb6879856-dm6qk"] Oct 03 09:03:28 crc kubenswrapper[4765]: W1003 09:03:28.306206 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ab88eac_dab7_4528_9654_8d5b819ca4d6.slice/crio-3fb294e8526ff4f92a39c47775962b3d1c38075bf5b9410c4fc224bc22a7fa4f WatchSource:0}: Error finding container 3fb294e8526ff4f92a39c47775962b3d1c38075bf5b9410c4fc224bc22a7fa4f: Status 404 returned error can't find the container with id 3fb294e8526ff4f92a39c47775962b3d1c38075bf5b9410c4fc224bc22a7fa4f Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.318883 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eef47c2-953c-4299-a994-a3ab5583b2d0" path="/var/lib/kubelet/pods/1eef47c2-953c-4299-a994-a3ab5583b2d0/volumes" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.592842 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:03:28 crc kubenswrapper[4765]: W1003 09:03:28.599323 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0deed7b7_7fac_4c23_bcb1_e790b70f7a9d.slice/crio-daa57d9a1090ed85a1d81910bf590745fd8b2d0a451d4d14b62363956e1f6076 WatchSource:0}: Error finding container daa57d9a1090ed85a1d81910bf590745fd8b2d0a451d4d14b62363956e1f6076: Status 404 returned error can't find the container with id daa57d9a1090ed85a1d81910bf590745fd8b2d0a451d4d14b62363956e1f6076 Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.684822 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d","Type":"ContainerStarted","Data":"daa57d9a1090ed85a1d81910bf590745fd8b2d0a451d4d14b62363956e1f6076"} Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.690994 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" event={"ID":"0ab88eac-dab7-4528-9654-8d5b819ca4d6","Type":"ContainerStarted","Data":"ee9a95e0785abb8e44ab41a49abb2952157fafe36c1779b8ee7dd20f37827f79"} Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.691027 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" event={"ID":"0ab88eac-dab7-4528-9654-8d5b819ca4d6","Type":"ContainerStarted","Data":"3fb294e8526ff4f92a39c47775962b3d1c38075bf5b9410c4fc224bc22a7fa4f"} Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.691051 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:28 crc kubenswrapper[4765]: I1003 09:03:28.713608 4765 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" podStartSLOduration=1.713590361 podStartE2EDuration="1.713590361s" podCreationTimestamp="2025-10-03 09:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:28.70839967 +0000 UTC m=+1453.009894010" watchObservedRunningTime="2025-10-03 09:03:28.713590361 +0000 UTC m=+1453.015084691" Oct 03 09:03:29 crc kubenswrapper[4765]: I1003 09:03:29.697855 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d","Type":"ContainerStarted","Data":"3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b"} Oct 03 09:03:29 crc kubenswrapper[4765]: I1003 09:03:29.714689 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.7146733259999998 podStartE2EDuration="2.714673326s" podCreationTimestamp="2025-10-03 09:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:29.713937198 +0000 UTC m=+1454.015431548" watchObservedRunningTime="2025-10-03 09:03:29.714673326 +0000 UTC m=+1454.016167656" Oct 03 09:03:31 crc kubenswrapper[4765]: I1003 09:03:31.004361 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:31 crc kubenswrapper[4765]: I1003 09:03:31.030463 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:31 crc kubenswrapper[4765]: I1003 09:03:31.745979 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:03:33 crc kubenswrapper[4765]: I1003 09:03:33.111177 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:03:34 crc kubenswrapper[4765]: I1003 09:03:34.080389 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:34 crc kubenswrapper[4765]: I1003 09:03:34.081126 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9ba86238-0b43-4a15-878a-1c0495098797" containerName="watcher-kuttl-api-log" containerID="cri-o://b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889" gracePeriod=30 Oct 03 09:03:34 crc kubenswrapper[4765]: I1003 09:03:34.081246 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9ba86238-0b43-4a15-878a-1c0495098797" containerName="watcher-api" containerID="cri-o://7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3" gracePeriod=30 Oct 03 09:03:34 crc kubenswrapper[4765]: I1003 09:03:34.740685 4765 generic.go:334] "Generic (PLEG): container finished" podID="9ba86238-0b43-4a15-878a-1c0495098797" containerID="b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889" exitCode=143 Oct 03 09:03:34 crc kubenswrapper[4765]: I1003 09:03:34.740733 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" 
event={"ID":"9ba86238-0b43-4a15-878a-1c0495098797","Type":"ContainerDied","Data":"b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889"} Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.503564 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.587173 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-cert-memcached-mtls\") pod \"9ba86238-0b43-4a15-878a-1c0495098797\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.587294 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trvjt\" (UniqueName: \"kubernetes.io/projected/9ba86238-0b43-4a15-878a-1c0495098797-kube-api-access-trvjt\") pod \"9ba86238-0b43-4a15-878a-1c0495098797\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.587403 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-config-data\") pod \"9ba86238-0b43-4a15-878a-1c0495098797\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.587444 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-internal-tls-certs\") pod \"9ba86238-0b43-4a15-878a-1c0495098797\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.587471 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-combined-ca-bundle\") pod \"9ba86238-0b43-4a15-878a-1c0495098797\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.587541 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba86238-0b43-4a15-878a-1c0495098797-logs\") pod \"9ba86238-0b43-4a15-878a-1c0495098797\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.587573 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-public-tls-certs\") pod \"9ba86238-0b43-4a15-878a-1c0495098797\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.587632 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-custom-prometheus-ca\") pod \"9ba86238-0b43-4a15-878a-1c0495098797\" (UID: \"9ba86238-0b43-4a15-878a-1c0495098797\") " Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.589222 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ba86238-0b43-4a15-878a-1c0495098797-logs" (OuterVolumeSpecName: "logs") pod "9ba86238-0b43-4a15-878a-1c0495098797" (UID: "9ba86238-0b43-4a15-878a-1c0495098797"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.614428 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba86238-0b43-4a15-878a-1c0495098797-kube-api-access-trvjt" (OuterVolumeSpecName: "kube-api-access-trvjt") pod "9ba86238-0b43-4a15-878a-1c0495098797" (UID: "9ba86238-0b43-4a15-878a-1c0495098797"). InnerVolumeSpecName "kube-api-access-trvjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.621953 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9ba86238-0b43-4a15-878a-1c0495098797" (UID: "9ba86238-0b43-4a15-878a-1c0495098797"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.633871 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ba86238-0b43-4a15-878a-1c0495098797" (UID: "9ba86238-0b43-4a15-878a-1c0495098797"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.645222 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-config-data" (OuterVolumeSpecName: "config-data") pod "9ba86238-0b43-4a15-878a-1c0495098797" (UID: "9ba86238-0b43-4a15-878a-1c0495098797"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.654777 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9ba86238-0b43-4a15-878a-1c0495098797" (UID: "9ba86238-0b43-4a15-878a-1c0495098797"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.665835 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9ba86238-0b43-4a15-878a-1c0495098797" (UID: "9ba86238-0b43-4a15-878a-1c0495098797"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.681975 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "9ba86238-0b43-4a15-878a-1c0495098797" (UID: "9ba86238-0b43-4a15-878a-1c0495098797"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.690363 4765 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.690409 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.690422 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba86238-0b43-4a15-878a-1c0495098797-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.690433 4765 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.690445 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.690456 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.690466 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trvjt\" (UniqueName: \"kubernetes.io/projected/9ba86238-0b43-4a15-878a-1c0495098797-kube-api-access-trvjt\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.690478 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba86238-0b43-4a15-878a-1c0495098797-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.754801 4765 generic.go:334] "Generic (PLEG): container finished" podID="9ba86238-0b43-4a15-878a-1c0495098797" containerID="7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3" exitCode=0 Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.754865 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.754888 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9ba86238-0b43-4a15-878a-1c0495098797","Type":"ContainerDied","Data":"7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3"} Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.755809 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9ba86238-0b43-4a15-878a-1c0495098797","Type":"ContainerDied","Data":"8a927fa22b80fe8f447ddb7c014121e80154d5ac20462b0f07cfdc186c35287a"} Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.755829 4765 scope.go:117] "RemoveContainer" containerID="7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.807900 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.815179 4765 scope.go:117] "RemoveContainer" containerID="b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.819718 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.840266 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:35 crc kubenswrapper[4765]: E1003 09:03:35.840955 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba86238-0b43-4a15-878a-1c0495098797" containerName="watcher-kuttl-api-log" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.840982 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba86238-0b43-4a15-878a-1c0495098797" containerName="watcher-kuttl-api-log" Oct 03 09:03:35 crc kubenswrapper[4765]: E1003 09:03:35.841014 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba86238-0b43-4a15-878a-1c0495098797" containerName="watcher-api" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.841026 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba86238-0b43-4a15-878a-1c0495098797" containerName="watcher-api" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.841205 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba86238-0b43-4a15-878a-1c0495098797" containerName="watcher-kuttl-api-log" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.841228 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba86238-0b43-4a15-878a-1c0495098797" containerName="watcher-api" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.841948 4765 scope.go:117] "RemoveContainer" containerID="7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.842408 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:35 crc kubenswrapper[4765]: E1003 09:03:35.845286 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3\": container with ID starting with 7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3 not found: ID does not exist" containerID="7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.845336 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3"} err="failed to get container status \"7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3\": rpc error: code = NotFound desc = could not find container \"7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3\": container with ID starting with 7e8678578c65ff6bf7710e1200474ac6b3f1041fa89dc3c52a2fa850b1b1fda3 not found: ID does not exist" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.845367 4765 scope.go:117] "RemoveContainer" containerID="b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.846954 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:03:35 crc kubenswrapper[4765]: E1003 09:03:35.847177 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889\": container with ID starting with b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889 not found: ID does not exist" containerID="b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.847209 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889"} err="failed to get container status \"b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889\": rpc error: code = NotFound desc = could not find container \"b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889\": container with ID starting with b61c8d9d774c761e7ab74d79239648cf11aa8e48e7485ad087a4085193226889 not found: ID does not exist" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.873996 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.998544 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.998617 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr9j2\" (UniqueName: \"kubernetes.io/projected/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-kube-api-access-lr9j2\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:35 crc 
kubenswrapper[4765]: I1003 09:03:35.998728 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.998791 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.998854 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-logs\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:35 crc kubenswrapper[4765]: I1003 09:03:35.998942 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.100594 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-logs\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.101051 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.101112 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.101145 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-logs\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.101157 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr9j2\" (UniqueName: \"kubernetes.io/projected/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-kube-api-access-lr9j2\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.101285 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.101344 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.104422 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.104427 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.105086 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.106086 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.124308 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr9j2\" (UniqueName: \"kubernetes.io/projected/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-kube-api-access-lr9j2\") pod \"watcher-kuttl-api-0\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.158663 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.325918 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba86238-0b43-4a15-878a-1c0495098797" path="/var/lib/kubelet/pods/9ba86238-0b43-4a15-878a-1c0495098797/volumes" Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.681748 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:03:36 crc kubenswrapper[4765]: I1003 09:03:36.766168 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7","Type":"ContainerStarted","Data":"30ffbbfaeaa402b374ff270e313a33980f73001a32d8342d62e3fe12e94b3ff2"} Oct 03 09:03:37 crc kubenswrapper[4765]: I1003 09:03:37.777945 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7","Type":"ContainerStarted","Data":"ecb964afaf2f1e274b0b620a5cab69ea802ecaad5e425655dc7383b31b2525c0"} Oct 03 09:03:37 crc kubenswrapper[4765]: I1003 09:03:37.778246 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7","Type":"ContainerStarted","Data":"8c4736edbd19119c5c09fed3accd8e17e5ee98ea1e12ca39fc3b59aeaeb12c20"} Oct 03 09:03:37 crc kubenswrapper[4765]: I1003 09:03:37.778261 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:37 crc kubenswrapper[4765]: I1003 09:03:37.805069 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.8050435670000002 podStartE2EDuration="2.805043567s" podCreationTimestamp="2025-10-03 09:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:37.795949917 +0000 UTC m=+1462.097444247" watchObservedRunningTime="2025-10-03 09:03:37.805043567 +0000 UTC m=+1462.106537897" Oct 03 09:03:38 crc kubenswrapper[4765]: I1003 09:03:38.178744 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:38 crc kubenswrapper[4765]: I1003 09:03:38.213130 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:38 crc kubenswrapper[4765]: I1003 09:03:38.810273 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:38 crc kubenswrapper[4765]: I1003 09:03:38.835856 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:03:40 crc kubenswrapper[4765]: I1003 09:03:40.439015 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:41 crc kubenswrapper[4765]: I1003 09:03:41.159774 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:46 crc kubenswrapper[4765]: I1003 09:03:46.159356 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 
09:03:46 crc kubenswrapper[4765]: I1003 09:03:46.165257 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:46 crc kubenswrapper[4765]: I1003 09:03:46.913610 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.560089 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qt28p"] Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.563048 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.575575 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qt28p"] Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.631960 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpcnd\" (UniqueName: \"kubernetes.io/projected/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-kube-api-access-zpcnd\") pod \"certified-operators-qt28p\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.632013 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-utilities\") pod \"certified-operators-qt28p\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.632044 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-catalog-content\") pod \"certified-operators-qt28p\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.733115 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpcnd\" (UniqueName: \"kubernetes.io/projected/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-kube-api-access-zpcnd\") pod \"certified-operators-qt28p\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.733181 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-utilities\") pod \"certified-operators-qt28p\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.733213 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-catalog-content\") pod \"certified-operators-qt28p\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.733797 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-utilities\") pod \"certified-operators-qt28p\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.733849 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-catalog-content\") pod \"certified-operators-qt28p\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.752097 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpcnd\" (UniqueName: \"kubernetes.io/projected/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-kube-api-access-zpcnd\") pod \"certified-operators-qt28p\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:03:55 crc kubenswrapper[4765]: I1003 09:03:55.886980 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:03:56 crc kubenswrapper[4765]: I1003 09:03:56.433683 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qt28p"] Oct 03 09:03:56 crc kubenswrapper[4765]: W1003 09:03:56.438425 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeba90d3_2b1f_4cc0_8c7f_7e03179794c9.slice/crio-0e1d73b0d5ffe77ae3c11c2a9ef9db4704b7bf9178700a67f6e19c29820f604c WatchSource:0}: Error finding container 0e1d73b0d5ffe77ae3c11c2a9ef9db4704b7bf9178700a67f6e19c29820f604c: Status 404 returned error can't find the container with id 0e1d73b0d5ffe77ae3c11c2a9ef9db4704b7bf9178700a67f6e19c29820f604c Oct 03 09:03:57 crc kubenswrapper[4765]: I1003 09:03:57.002989 4765 generic.go:334] "Generic (PLEG): container finished" podID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" containerID="e90889130f3980ac56af2225d66989bba2ec93f8c823550295badd2335384431" exitCode=0 Oct 03 09:03:57 crc kubenswrapper[4765]: I1003 09:03:57.003041 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qt28p" event={"ID":"feba90d3-2b1f-4cc0-8c7f-7e03179794c9","Type":"ContainerDied","Data":"e90889130f3980ac56af2225d66989bba2ec93f8c823550295badd2335384431"} Oct 03 09:03:57 crc kubenswrapper[4765]: I1003 09:03:57.003274 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qt28p" event={"ID":"feba90d3-2b1f-4cc0-8c7f-7e03179794c9","Type":"ContainerStarted","Data":"0e1d73b0d5ffe77ae3c11c2a9ef9db4704b7bf9178700a67f6e19c29820f604c"} Oct 03 09:03:58 crc kubenswrapper[4765]: I1003 09:03:58.013104 4765 generic.go:334] "Generic (PLEG): container finished" podID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" containerID="103bd5132292c0c5c3902669c6fe9e99b8564a71604b4da01a0b67a38d8d2ca2" exitCode=0 Oct 03 09:03:58 crc kubenswrapper[4765]: I1003 09:03:58.013267 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qt28p" event={"ID":"feba90d3-2b1f-4cc0-8c7f-7e03179794c9","Type":"ContainerDied","Data":"103bd5132292c0c5c3902669c6fe9e99b8564a71604b4da01a0b67a38d8d2ca2"} Oct 03 09:03:59 crc kubenswrapper[4765]: I1003 09:03:59.321251 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="watcher-kuttl-default/keystone-7bb6879856-dm6qk" Oct 03 09:03:59 crc kubenswrapper[4765]: I1003 09:03:59.373399 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-596469b6bd-hwkh5"] Oct 03 09:03:59 crc kubenswrapper[4765]: I1003 09:03:59.375239 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" podUID="783d142e-5f7f-4ea1-bed2-6b55f7a35aec" containerName="keystone-api" containerID="cri-o://f4b4d45ba6779950271e08ec96068d24d4e2040f61a118c926515522ef7fbe42" gracePeriod=30 Oct 03 09:03:59 crc kubenswrapper[4765]: I1003 09:03:59.757391 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fbbnh"] Oct 03 09:03:59 crc kubenswrapper[4765]: I1003 09:03:59.759669 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:03:59 crc kubenswrapper[4765]: I1003 09:03:59.796020 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fbbnh"] Oct 03 09:03:59 crc kubenswrapper[4765]: I1003 09:03:59.899479 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wktht\" (UniqueName: \"kubernetes.io/projected/1b5dbcd7-9207-482b-b2ca-d56795e51267-kube-api-access-wktht\") pod \"redhat-marketplace-fbbnh\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:03:59 crc kubenswrapper[4765]: I1003 09:03:59.899561 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-catalog-content\") pod \"redhat-marketplace-fbbnh\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:03:59 crc kubenswrapper[4765]: I1003 09:03:59.899592 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-utilities\") pod \"redhat-marketplace-fbbnh\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.001232 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wktht\" (UniqueName: \"kubernetes.io/projected/1b5dbcd7-9207-482b-b2ca-d56795e51267-kube-api-access-wktht\") pod \"redhat-marketplace-fbbnh\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.001377 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-catalog-content\") pod \"redhat-marketplace-fbbnh\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.001414 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-utilities\") pod \"redhat-marketplace-fbbnh\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " 
pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.001915 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-catalog-content\") pod \"redhat-marketplace-fbbnh\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.002015 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-utilities\") pod \"redhat-marketplace-fbbnh\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.033281 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wktht\" (UniqueName: \"kubernetes.io/projected/1b5dbcd7-9207-482b-b2ca-d56795e51267-kube-api-access-wktht\") pod \"redhat-marketplace-fbbnh\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.037626 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qt28p" event={"ID":"feba90d3-2b1f-4cc0-8c7f-7e03179794c9","Type":"ContainerStarted","Data":"0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d"} Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.059096 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qt28p" podStartSLOduration=2.468324234 podStartE2EDuration="5.059077584s" podCreationTimestamp="2025-10-03 09:03:55 +0000 UTC" firstStartedPulling="2025-10-03 09:03:57.006883884 +0000 UTC m=+1481.308378214" lastFinishedPulling="2025-10-03 09:03:59.597637224 +0000 UTC m=+1483.899131564" observedRunningTime="2025-10-03 09:04:00.052936489 +0000 UTC m=+1484.354430829" watchObservedRunningTime="2025-10-03 09:04:00.059077584 +0000 UTC m=+1484.360571914" Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.082386 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.599198 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fbbnh"] Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.680308 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:04:00 crc kubenswrapper[4765]: I1003 09:04:00.680371 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:04:01 crc kubenswrapper[4765]: I1003 09:04:01.048027 4765 generic.go:334] "Generic (PLEG): container finished" podID="1b5dbcd7-9207-482b-b2ca-d56795e51267" containerID="de9b65b0792e811c2d671ba6b753f07e2122358d7e5ee57a19a8629f72775c09" exitCode=0 Oct 03 09:04:01 crc kubenswrapper[4765]: I1003 09:04:01.048144 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fbbnh" event={"ID":"1b5dbcd7-9207-482b-b2ca-d56795e51267","Type":"ContainerDied","Data":"de9b65b0792e811c2d671ba6b753f07e2122358d7e5ee57a19a8629f72775c09"} Oct 03 09:04:01 crc kubenswrapper[4765]: I1003 09:04:01.048200 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fbbnh" event={"ID":"1b5dbcd7-9207-482b-b2ca-d56795e51267","Type":"ContainerStarted","Data":"b09d624e3626f22cae5cf31df11aef1526a3abeb531349c68751781d46443194"} Oct 03 09:04:02 crc kubenswrapper[4765]: I1003 09:04:02.059052 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fbbnh" event={"ID":"1b5dbcd7-9207-482b-b2ca-d56795e51267","Type":"ContainerStarted","Data":"b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c"} Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.080205 4765 generic.go:334] "Generic (PLEG): container finished" podID="783d142e-5f7f-4ea1-bed2-6b55f7a35aec" containerID="f4b4d45ba6779950271e08ec96068d24d4e2040f61a118c926515522ef7fbe42" exitCode=0 Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.080410 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" event={"ID":"783d142e-5f7f-4ea1-bed2-6b55f7a35aec","Type":"ContainerDied","Data":"f4b4d45ba6779950271e08ec96068d24d4e2040f61a118c926515522ef7fbe42"} Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.083210 4765 generic.go:334] "Generic (PLEG): container finished" podID="1b5dbcd7-9207-482b-b2ca-d56795e51267" containerID="b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c" exitCode=0 Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.083246 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fbbnh" event={"ID":"1b5dbcd7-9207-482b-b2ca-d56795e51267","Type":"ContainerDied","Data":"b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c"} Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.284133 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.359202 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-combined-ca-bundle\") pod \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.359343 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-scripts\") pod \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.359373 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-public-tls-certs\") pod \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.360169 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-internal-tls-certs\") pod \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.360224 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-fernet-keys\") pod \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.360247 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-config-data\") pod \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.360278 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zz5t\" (UniqueName: \"kubernetes.io/projected/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-kube-api-access-6zz5t\") pod \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.360368 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-credential-keys\") pod \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\" (UID: \"783d142e-5f7f-4ea1-bed2-6b55f7a35aec\") " Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.417845 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "783d142e-5f7f-4ea1-bed2-6b55f7a35aec" (UID: "783d142e-5f7f-4ea1-bed2-6b55f7a35aec"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.421629 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-scripts" (OuterVolumeSpecName: "scripts") pod "783d142e-5f7f-4ea1-bed2-6b55f7a35aec" (UID: "783d142e-5f7f-4ea1-bed2-6b55f7a35aec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.436865 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "783d142e-5f7f-4ea1-bed2-6b55f7a35aec" (UID: "783d142e-5f7f-4ea1-bed2-6b55f7a35aec"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.437047 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-kube-api-access-6zz5t" (OuterVolumeSpecName: "kube-api-access-6zz5t") pod "783d142e-5f7f-4ea1-bed2-6b55f7a35aec" (UID: "783d142e-5f7f-4ea1-bed2-6b55f7a35aec"). InnerVolumeSpecName "kube-api-access-6zz5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.464983 4765 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.465027 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.465039 4765 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.465051 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zz5t\" (UniqueName: \"kubernetes.io/projected/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-kube-api-access-6zz5t\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.469669 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "783d142e-5f7f-4ea1-bed2-6b55f7a35aec" (UID: "783d142e-5f7f-4ea1-bed2-6b55f7a35aec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.535841 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "783d142e-5f7f-4ea1-bed2-6b55f7a35aec" (UID: "783d142e-5f7f-4ea1-bed2-6b55f7a35aec"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.557090 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-config-data" (OuterVolumeSpecName: "config-data") pod "783d142e-5f7f-4ea1-bed2-6b55f7a35aec" (UID: "783d142e-5f7f-4ea1-bed2-6b55f7a35aec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.563799 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "783d142e-5f7f-4ea1-bed2-6b55f7a35aec" (UID: "783d142e-5f7f-4ea1-bed2-6b55f7a35aec"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.566501 4765 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.566536 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.566546 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:03 crc kubenswrapper[4765]: I1003 09:04:03.566554 4765 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/783d142e-5f7f-4ea1-bed2-6b55f7a35aec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:04 crc kubenswrapper[4765]: I1003 09:04:04.091936 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" event={"ID":"783d142e-5f7f-4ea1-bed2-6b55f7a35aec","Type":"ContainerDied","Data":"07bf28ea03cc6da5dc699e7363213b3bb850dce5a6ade67de6680b27e8ebee9a"} Oct 03 09:04:04 crc kubenswrapper[4765]: I1003 09:04:04.091984 4765 scope.go:117] "RemoveContainer" containerID="f4b4d45ba6779950271e08ec96068d24d4e2040f61a118c926515522ef7fbe42" Oct 03 09:04:04 crc kubenswrapper[4765]: I1003 09:04:04.092053 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/keystone-596469b6bd-hwkh5" Oct 03 09:04:04 crc kubenswrapper[4765]: I1003 09:04:04.133743 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-596469b6bd-hwkh5"] Oct 03 09:04:04 crc kubenswrapper[4765]: I1003 09:04:04.140046 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-596469b6bd-hwkh5"] Oct 03 09:04:04 crc kubenswrapper[4765]: I1003 09:04:04.316861 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783d142e-5f7f-4ea1-bed2-6b55f7a35aec" path="/var/lib/kubelet/pods/783d142e-5f7f-4ea1-bed2-6b55f7a35aec/volumes" Oct 03 09:04:05 crc kubenswrapper[4765]: I1003 09:04:05.100812 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fbbnh" event={"ID":"1b5dbcd7-9207-482b-b2ca-d56795e51267","Type":"ContainerStarted","Data":"edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4"} Oct 03 09:04:05 crc kubenswrapper[4765]: I1003 09:04:05.117960 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fbbnh" podStartSLOduration=3.226379304 podStartE2EDuration="6.117943498s" podCreationTimestamp="2025-10-03 09:03:59 +0000 UTC" firstStartedPulling="2025-10-03 09:04:01.049709705 +0000 UTC m=+1485.351204035" lastFinishedPulling="2025-10-03 09:04:03.941273899 +0000 UTC m=+1488.242768229" observedRunningTime="2025-10-03 09:04:05.115604158 +0000 UTC m=+1489.417098488" watchObservedRunningTime="2025-10-03 09:04:05.117943498 +0000 UTC m=+1489.419437828" Oct 03 09:04:05 crc kubenswrapper[4765]: I1003 09:04:05.887691 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:04:05 crc kubenswrapper[4765]: I1003 09:04:05.887745 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:04:05 crc kubenswrapper[4765]: I1003 09:04:05.940395 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:04:06 crc kubenswrapper[4765]: I1003 09:04:06.155608 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:04:06 crc kubenswrapper[4765]: I1003 09:04:06.772354 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:06 crc kubenswrapper[4765]: I1003 09:04:06.772698 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="ceilometer-central-agent" containerID="cri-o://0ed793914a926a330301fc682306b9862b81d5c880898a1ca7915a9f72dc751e" gracePeriod=30 Oct 03 09:04:06 crc kubenswrapper[4765]: I1003 09:04:06.772825 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="ceilometer-notification-agent" containerID="cri-o://551ebb5a4921e3577d7f5c4c4913b0f4198e76251b38360b0e5f9bc493ad69fc" gracePeriod=30 Oct 03 09:04:06 crc kubenswrapper[4765]: I1003 09:04:06.772897 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="proxy-httpd" 
containerID="cri-o://cd1b44b04a74a6c33dd93784d299aa05f27a684d0a8af972bd2779294c393815" gracePeriod=30 Oct 03 09:04:06 crc kubenswrapper[4765]: I1003 09:04:06.773212 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="sg-core" containerID="cri-o://3799d764f3c9903802bca2d1df0d3fbc3bfcb31a932fba626fe6faac9b780c75" gracePeriod=30 Oct 03 09:04:07 crc kubenswrapper[4765]: I1003 09:04:07.126886 4765 generic.go:334] "Generic (PLEG): container finished" podID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerID="cd1b44b04a74a6c33dd93784d299aa05f27a684d0a8af972bd2779294c393815" exitCode=0 Oct 03 09:04:07 crc kubenswrapper[4765]: I1003 09:04:07.126921 4765 generic.go:334] "Generic (PLEG): container finished" podID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerID="3799d764f3c9903802bca2d1df0d3fbc3bfcb31a932fba626fe6faac9b780c75" exitCode=2 Oct 03 09:04:07 crc kubenswrapper[4765]: I1003 09:04:07.126984 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca","Type":"ContainerDied","Data":"cd1b44b04a74a6c33dd93784d299aa05f27a684d0a8af972bd2779294c393815"} Oct 03 09:04:07 crc kubenswrapper[4765]: I1003 09:04:07.127041 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca","Type":"ContainerDied","Data":"3799d764f3c9903802bca2d1df0d3fbc3bfcb31a932fba626fe6faac9b780c75"} Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.209223 4765 generic.go:334] "Generic (PLEG): container finished" podID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerID="551ebb5a4921e3577d7f5c4c4913b0f4198e76251b38360b0e5f9bc493ad69fc" exitCode=0 Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.209566 4765 generic.go:334] "Generic (PLEG): container finished" podID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerID="0ed793914a926a330301fc682306b9862b81d5c880898a1ca7915a9f72dc751e" exitCode=0 Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.209592 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca","Type":"ContainerDied","Data":"551ebb5a4921e3577d7f5c4c4913b0f4198e76251b38360b0e5f9bc493ad69fc"} Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.209622 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca","Type":"ContainerDied","Data":"0ed793914a926a330301fc682306b9862b81d5c880898a1ca7915a9f72dc751e"} Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.354052 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qt28p"] Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.354611 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qt28p" podUID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" containerName="registry-server" containerID="cri-o://0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d" gracePeriod=2 Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.541713 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.653032 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-ceilometer-tls-certs\") pod \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.653106 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-sg-core-conf-yaml\") pod \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.653167 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-config-data\") pod \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.653200 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-combined-ca-bundle\") pod \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.653239 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfws9\" (UniqueName: \"kubernetes.io/projected/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-kube-api-access-tfws9\") pod \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.653321 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-log-httpd\") pod \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.653451 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-run-httpd\") pod \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.653518 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-scripts\") pod \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\" (UID: \"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca\") " Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.661866 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-scripts" (OuterVolumeSpecName: "scripts") pod "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" (UID: "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.664921 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" (UID: "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.665227 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" (UID: "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.665938 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-kube-api-access-tfws9" (OuterVolumeSpecName: "kube-api-access-tfws9") pod "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" (UID: "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca"). InnerVolumeSpecName "kube-api-access-tfws9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.697531 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" (UID: "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.737019 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" (UID: "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.755631 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.755721 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.755734 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfws9\" (UniqueName: \"kubernetes.io/projected/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-kube-api-access-tfws9\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.755748 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.755757 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.755769 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.755852 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" (UID: "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.758744 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.787613 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-config-data" (OuterVolumeSpecName: "config-data") pod "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" (UID: "1cc99b7b-ff5c-4e91-a07c-52f372cd1fca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.856903 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-catalog-content\") pod \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.857113 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpcnd\" (UniqueName: \"kubernetes.io/projected/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-kube-api-access-zpcnd\") pod \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.857184 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-utilities\") pod \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\" (UID: \"feba90d3-2b1f-4cc0-8c7f-7e03179794c9\") " Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.857630 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.857664 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.858399 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-utilities" (OuterVolumeSpecName: "utilities") pod "feba90d3-2b1f-4cc0-8c7f-7e03179794c9" (UID: "feba90d3-2b1f-4cc0-8c7f-7e03179794c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.861327 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-kube-api-access-zpcnd" (OuterVolumeSpecName: "kube-api-access-zpcnd") pod "feba90d3-2b1f-4cc0-8c7f-7e03179794c9" (UID: "feba90d3-2b1f-4cc0-8c7f-7e03179794c9"). InnerVolumeSpecName "kube-api-access-zpcnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.905068 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "feba90d3-2b1f-4cc0-8c7f-7e03179794c9" (UID: "feba90d3-2b1f-4cc0-8c7f-7e03179794c9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.959593 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.959660 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpcnd\" (UniqueName: \"kubernetes.io/projected/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-kube-api-access-zpcnd\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:08 crc kubenswrapper[4765]: I1003 09:04:08.959676 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/feba90d3-2b1f-4cc0-8c7f-7e03179794c9-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.219820 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.219815 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"1cc99b7b-ff5c-4e91-a07c-52f372cd1fca","Type":"ContainerDied","Data":"138dd69f06db6ca626bc445a63ac569671a8911838a7d468c4c41c202487762a"} Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.219964 4765 scope.go:117] "RemoveContainer" containerID="cd1b44b04a74a6c33dd93784d299aa05f27a684d0a8af972bd2779294c393815" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.223280 4765 generic.go:334] "Generic (PLEG): container finished" podID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" containerID="0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d" exitCode=0 Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.223313 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qt28p" event={"ID":"feba90d3-2b1f-4cc0-8c7f-7e03179794c9","Type":"ContainerDied","Data":"0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d"} Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.223331 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qt28p" event={"ID":"feba90d3-2b1f-4cc0-8c7f-7e03179794c9","Type":"ContainerDied","Data":"0e1d73b0d5ffe77ae3c11c2a9ef9db4704b7bf9178700a67f6e19c29820f604c"} Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.223345 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qt28p" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.245314 4765 scope.go:117] "RemoveContainer" containerID="3799d764f3c9903802bca2d1df0d3fbc3bfcb31a932fba626fe6faac9b780c75" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.257793 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.270742 4765 scope.go:117] "RemoveContainer" containerID="551ebb5a4921e3577d7f5c4c4913b0f4198e76251b38360b0e5f9bc493ad69fc" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.272610 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.292486 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qt28p"] Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.319377 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qt28p"] Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341066 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:09 crc kubenswrapper[4765]: E1003 09:04:09.341390 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783d142e-5f7f-4ea1-bed2-6b55f7a35aec" containerName="keystone-api" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341404 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="783d142e-5f7f-4ea1-bed2-6b55f7a35aec" containerName="keystone-api" Oct 03 09:04:09 crc kubenswrapper[4765]: E1003 09:04:09.341422 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" containerName="registry-server" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341481 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" containerName="registry-server" Oct 03 09:04:09 crc kubenswrapper[4765]: E1003 09:04:09.341498 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" containerName="extract-content" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341504 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" containerName="extract-content" Oct 03 09:04:09 crc kubenswrapper[4765]: E1003 09:04:09.341519 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="proxy-httpd" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341524 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="proxy-httpd" Oct 03 09:04:09 crc kubenswrapper[4765]: E1003 09:04:09.341538 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="ceilometer-notification-agent" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341566 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="ceilometer-notification-agent" Oct 03 09:04:09 crc kubenswrapper[4765]: E1003 09:04:09.341578 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="ceilometer-central-agent" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341584 4765 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="ceilometer-central-agent" Oct 03 09:04:09 crc kubenswrapper[4765]: E1003 09:04:09.341593 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" containerName="extract-utilities" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341598 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" containerName="extract-utilities" Oct 03 09:04:09 crc kubenswrapper[4765]: E1003 09:04:09.341610 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="sg-core" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341615 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="sg-core" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341815 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" containerName="registry-server" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341845 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="sg-core" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341855 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="ceilometer-central-agent" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341865 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="ceilometer-notification-agent" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341879 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="783d142e-5f7f-4ea1-bed2-6b55f7a35aec" containerName="keystone-api" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.341890 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" containerName="proxy-httpd" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.342383 4765 scope.go:117] "RemoveContainer" containerID="0ed793914a926a330301fc682306b9862b81d5c880898a1ca7915a9f72dc751e" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.344069 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.351328 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.351391 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.351343 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.351544 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.369722 4765 scope.go:117] "RemoveContainer" containerID="0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.387988 4765 scope.go:117] "RemoveContainer" containerID="103bd5132292c0c5c3902669c6fe9e99b8564a71604b4da01a0b67a38d8d2ca2" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.414002 4765 scope.go:117] "RemoveContainer" containerID="e90889130f3980ac56af2225d66989bba2ec93f8c823550295badd2335384431" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.441607 4765 scope.go:117] "RemoveContainer" containerID="0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d" Oct 03 09:04:09 crc kubenswrapper[4765]: E1003 09:04:09.442073 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d\": container with ID starting with 0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d not found: ID does not exist" containerID="0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.442107 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d"} err="failed to get container status \"0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d\": rpc error: code = NotFound desc = could not find container \"0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d\": container with ID starting with 0d2fcb1ede19abb8c7b306c228d0d976d78c1da7794b3199895326a2d379463d not found: ID does not exist" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.442132 4765 scope.go:117] "RemoveContainer" containerID="103bd5132292c0c5c3902669c6fe9e99b8564a71604b4da01a0b67a38d8d2ca2" Oct 03 09:04:09 crc kubenswrapper[4765]: E1003 09:04:09.442663 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"103bd5132292c0c5c3902669c6fe9e99b8564a71604b4da01a0b67a38d8d2ca2\": container with ID starting with 103bd5132292c0c5c3902669c6fe9e99b8564a71604b4da01a0b67a38d8d2ca2 not found: ID does not exist" containerID="103bd5132292c0c5c3902669c6fe9e99b8564a71604b4da01a0b67a38d8d2ca2" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.442690 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"103bd5132292c0c5c3902669c6fe9e99b8564a71604b4da01a0b67a38d8d2ca2"} err="failed to get container status \"103bd5132292c0c5c3902669c6fe9e99b8564a71604b4da01a0b67a38d8d2ca2\": rpc error: code = NotFound desc = 
could not find container \"103bd5132292c0c5c3902669c6fe9e99b8564a71604b4da01a0b67a38d8d2ca2\": container with ID starting with 103bd5132292c0c5c3902669c6fe9e99b8564a71604b4da01a0b67a38d8d2ca2 not found: ID does not exist" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.442708 4765 scope.go:117] "RemoveContainer" containerID="e90889130f3980ac56af2225d66989bba2ec93f8c823550295badd2335384431" Oct 03 09:04:09 crc kubenswrapper[4765]: E1003 09:04:09.443021 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e90889130f3980ac56af2225d66989bba2ec93f8c823550295badd2335384431\": container with ID starting with e90889130f3980ac56af2225d66989bba2ec93f8c823550295badd2335384431 not found: ID does not exist" containerID="e90889130f3980ac56af2225d66989bba2ec93f8c823550295badd2335384431" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.443049 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e90889130f3980ac56af2225d66989bba2ec93f8c823550295badd2335384431"} err="failed to get container status \"e90889130f3980ac56af2225d66989bba2ec93f8c823550295badd2335384431\": rpc error: code = NotFound desc = could not find container \"e90889130f3980ac56af2225d66989bba2ec93f8c823550295badd2335384431\": container with ID starting with e90889130f3980ac56af2225d66989bba2ec93f8c823550295badd2335384431 not found: ID does not exist" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.469106 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.469182 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-run-httpd\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.469234 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.469280 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-log-httpd\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.469304 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-config-data\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.469345 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ns47t\" (UniqueName: \"kubernetes.io/projected/e2b874f8-a25b-46c7-bff2-45197b16caa7-kube-api-access-ns47t\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.469366 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.469386 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-scripts\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.570807 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns47t\" (UniqueName: \"kubernetes.io/projected/e2b874f8-a25b-46c7-bff2-45197b16caa7-kube-api-access-ns47t\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.570849 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.570875 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-scripts\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.571504 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.571569 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-run-httpd\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.571608 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.571671 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-log-httpd\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " 
pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.571697 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-config-data\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.573732 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-run-httpd\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.573774 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-log-httpd\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.575151 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-scripts\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.575797 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-config-data\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.577218 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.577474 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.578265 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.596986 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns47t\" (UniqueName: \"kubernetes.io/projected/e2b874f8-a25b-46c7-bff2-45197b16caa7-kube-api-access-ns47t\") pod \"ceilometer-0\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:09 crc kubenswrapper[4765]: I1003 09:04:09.661521 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:10 crc kubenswrapper[4765]: I1003 09:04:10.083476 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:10 crc kubenswrapper[4765]: I1003 09:04:10.084125 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:10 crc kubenswrapper[4765]: I1003 09:04:10.128297 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:10 crc kubenswrapper[4765]: I1003 09:04:10.140803 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:10 crc kubenswrapper[4765]: I1003 09:04:10.234128 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e2b874f8-a25b-46c7-bff2-45197b16caa7","Type":"ContainerStarted","Data":"de7b05e5078ea573839ebd0c20ee5963c559c2f0f4d710cae8f0034494c699ae"} Oct 03 09:04:10 crc kubenswrapper[4765]: I1003 09:04:10.289721 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:10 crc kubenswrapper[4765]: I1003 09:04:10.317365 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc99b7b-ff5c-4e91-a07c-52f372cd1fca" path="/var/lib/kubelet/pods/1cc99b7b-ff5c-4e91-a07c-52f372cd1fca/volumes" Oct 03 09:04:10 crc kubenswrapper[4765]: I1003 09:04:10.318242 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feba90d3-2b1f-4cc0-8c7f-7e03179794c9" path="/var/lib/kubelet/pods/feba90d3-2b1f-4cc0-8c7f-7e03179794c9/volumes" Oct 03 09:04:11 crc kubenswrapper[4765]: I1003 09:04:11.247438 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e2b874f8-a25b-46c7-bff2-45197b16caa7","Type":"ContainerStarted","Data":"02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0"} Oct 03 09:04:12 crc kubenswrapper[4765]: I1003 09:04:12.267746 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e2b874f8-a25b-46c7-bff2-45197b16caa7","Type":"ContainerStarted","Data":"9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40"} Oct 03 09:04:12 crc kubenswrapper[4765]: I1003 09:04:12.554997 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fbbnh"] Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.278977 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fbbnh" podUID="1b5dbcd7-9207-482b-b2ca-d56795e51267" containerName="registry-server" containerID="cri-o://edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4" gracePeriod=2 Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.279082 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e2b874f8-a25b-46c7-bff2-45197b16caa7","Type":"ContainerStarted","Data":"a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce"} Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.748557 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.858712 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-utilities\") pod \"1b5dbcd7-9207-482b-b2ca-d56795e51267\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.858860 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wktht\" (UniqueName: \"kubernetes.io/projected/1b5dbcd7-9207-482b-b2ca-d56795e51267-kube-api-access-wktht\") pod \"1b5dbcd7-9207-482b-b2ca-d56795e51267\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.858937 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-catalog-content\") pod \"1b5dbcd7-9207-482b-b2ca-d56795e51267\" (UID: \"1b5dbcd7-9207-482b-b2ca-d56795e51267\") " Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.859607 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-utilities" (OuterVolumeSpecName: "utilities") pod "1b5dbcd7-9207-482b-b2ca-d56795e51267" (UID: "1b5dbcd7-9207-482b-b2ca-d56795e51267"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.863961 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5dbcd7-9207-482b-b2ca-d56795e51267-kube-api-access-wktht" (OuterVolumeSpecName: "kube-api-access-wktht") pod "1b5dbcd7-9207-482b-b2ca-d56795e51267" (UID: "1b5dbcd7-9207-482b-b2ca-d56795e51267"). InnerVolumeSpecName "kube-api-access-wktht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.871674 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b5dbcd7-9207-482b-b2ca-d56795e51267" (UID: "1b5dbcd7-9207-482b-b2ca-d56795e51267"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.960848 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.961106 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wktht\" (UniqueName: \"kubernetes.io/projected/1b5dbcd7-9207-482b-b2ca-d56795e51267-kube-api-access-wktht\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:13 crc kubenswrapper[4765]: I1003 09:04:13.961117 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b5dbcd7-9207-482b-b2ca-d56795e51267-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.292406 4765 generic.go:334] "Generic (PLEG): container finished" podID="1b5dbcd7-9207-482b-b2ca-d56795e51267" containerID="edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4" exitCode=0 Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.292489 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fbbnh" event={"ID":"1b5dbcd7-9207-482b-b2ca-d56795e51267","Type":"ContainerDied","Data":"edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4"} Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.292567 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fbbnh" event={"ID":"1b5dbcd7-9207-482b-b2ca-d56795e51267","Type":"ContainerDied","Data":"b09d624e3626f22cae5cf31df11aef1526a3abeb531349c68751781d46443194"} Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.292611 4765 scope.go:117] "RemoveContainer" containerID="edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.293803 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fbbnh" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.296409 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e2b874f8-a25b-46c7-bff2-45197b16caa7","Type":"ContainerStarted","Data":"93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9"} Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.297284 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.319593 4765 scope.go:117] "RemoveContainer" containerID="b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.338236 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.994868826 podStartE2EDuration="5.338211359s" podCreationTimestamp="2025-10-03 09:04:09 +0000 UTC" firstStartedPulling="2025-10-03 09:04:10.135961607 +0000 UTC m=+1494.437455937" lastFinishedPulling="2025-10-03 09:04:13.47930414 +0000 UTC m=+1497.780798470" observedRunningTime="2025-10-03 09:04:14.331894706 +0000 UTC m=+1498.633389046" watchObservedRunningTime="2025-10-03 09:04:14.338211359 +0000 UTC m=+1498.639705699" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.352905 4765 scope.go:117] "RemoveContainer" containerID="de9b65b0792e811c2d671ba6b753f07e2122358d7e5ee57a19a8629f72775c09" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.354717 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fbbnh"] Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.364254 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fbbnh"] Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.372384 4765 scope.go:117] "RemoveContainer" containerID="edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4" Oct 03 09:04:14 crc kubenswrapper[4765]: E1003 09:04:14.372775 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4\": container with ID starting with edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4 not found: ID does not exist" containerID="edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.372817 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4"} err="failed to get container status \"edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4\": rpc error: code = NotFound desc = could not find container \"edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4\": container with ID starting with edcf2edb10681382ff89ddaa19ccfe089dc52b2bb1ccafae522fc6abd7fe3fc4 not found: ID does not exist" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.372842 4765 scope.go:117] "RemoveContainer" containerID="b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c" Oct 03 09:04:14 crc kubenswrapper[4765]: E1003 09:04:14.373184 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c\": container with ID starting with b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c not found: ID does not exist" containerID="b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.373249 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c"} err="failed to get container status \"b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c\": rpc error: code = NotFound desc = could not find container \"b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c\": container with ID starting with b8404d34e24f042e49d1b9aa7082b44c624ee88c2f8d0913e6ca8a62d391503c not found: ID does not exist" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.373286 4765 scope.go:117] "RemoveContainer" containerID="de9b65b0792e811c2d671ba6b753f07e2122358d7e5ee57a19a8629f72775c09" Oct 03 09:04:14 crc kubenswrapper[4765]: E1003 09:04:14.373633 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de9b65b0792e811c2d671ba6b753f07e2122358d7e5ee57a19a8629f72775c09\": container with ID starting with de9b65b0792e811c2d671ba6b753f07e2122358d7e5ee57a19a8629f72775c09 not found: ID does not exist" containerID="de9b65b0792e811c2d671ba6b753f07e2122358d7e5ee57a19a8629f72775c09" Oct 03 09:04:14 crc kubenswrapper[4765]: I1003 09:04:14.373677 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9b65b0792e811c2d671ba6b753f07e2122358d7e5ee57a19a8629f72775c09"} err="failed to get container status \"de9b65b0792e811c2d671ba6b753f07e2122358d7e5ee57a19a8629f72775c09\": rpc error: code = NotFound desc = could not find container \"de9b65b0792e811c2d671ba6b753f07e2122358d7e5ee57a19a8629f72775c09\": container with ID starting with de9b65b0792e811c2d671ba6b753f07e2122358d7e5ee57a19a8629f72775c09 not found: ID does not exist" Oct 03 09:04:16 crc kubenswrapper[4765]: I1003 09:04:16.327327 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5dbcd7-9207-482b-b2ca-d56795e51267" path="/var/lib/kubelet/pods/1b5dbcd7-9207-482b-b2ca-d56795e51267/volumes" Oct 03 09:04:17 crc kubenswrapper[4765]: I1003 09:04:17.470238 4765 scope.go:117] "RemoveContainer" containerID="f9b1fa593b2043d2de9aa7a0b58ede46415e6225d4d4183e3963cdf0e49d9560" Oct 03 09:04:17 crc kubenswrapper[4765]: I1003 09:04:17.525577 4765 scope.go:117] "RemoveContainer" containerID="371554d6a23867e20f20b0b92c3bae02aa1730cacbdf0e5968dd0522a11bf163" Oct 03 09:04:30 crc kubenswrapper[4765]: I1003 09:04:30.680345 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:04:30 crc kubenswrapper[4765]: I1003 09:04:30.680939 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:04:39 crc kubenswrapper[4765]: I1003 09:04:39.671948 4765 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.689713 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vslqx"] Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.702941 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-vslqx"] Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.776015 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.776264 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="cf49515d-149d-4084-a0bc-5f5ddb0d6739" containerName="watcher-applier" containerID="cri-o://29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e" gracePeriod=30 Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.807261 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcheradea-account-delete-4kprl"] Oct 03 09:04:48 crc kubenswrapper[4765]: E1003 09:04:48.807611 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5dbcd7-9207-482b-b2ca-d56795e51267" containerName="registry-server" Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.807626 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5dbcd7-9207-482b-b2ca-d56795e51267" containerName="registry-server" Oct 03 09:04:48 crc kubenswrapper[4765]: E1003 09:04:48.807637 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5dbcd7-9207-482b-b2ca-d56795e51267" containerName="extract-utilities" Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.807657 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5dbcd7-9207-482b-b2ca-d56795e51267" containerName="extract-utilities" Oct 03 09:04:48 crc kubenswrapper[4765]: E1003 09:04:48.807677 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5dbcd7-9207-482b-b2ca-d56795e51267" containerName="extract-content" Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.807684 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5dbcd7-9207-482b-b2ca-d56795e51267" containerName="extract-content" Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.807829 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5dbcd7-9207-482b-b2ca-d56795e51267" containerName="registry-server" Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.808399 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcheradea-account-delete-4kprl" Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.825479 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcheradea-account-delete-4kprl"] Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.869730 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh5ww\" (UniqueName: \"kubernetes.io/projected/638a2f4d-4147-4f7b-8032-947dac4e402b-kube-api-access-fh5ww\") pod \"watcheradea-account-delete-4kprl\" (UID: \"638a2f4d-4147-4f7b-8032-947dac4e402b\") " pod="watcher-kuttl-default/watcheradea-account-delete-4kprl" Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.883006 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-nt8ws"] Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.895656 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-nt8ws"] Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.919962 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.920220 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" containerName="watcher-decision-engine" containerID="cri-o://3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b" gracePeriod=30 Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.947769 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcheradea-account-delete-4kprl"] Oct 03 09:04:48 crc kubenswrapper[4765]: E1003 09:04:48.948385 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-fh5ww], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/watcheradea-account-delete-4kprl" podUID="638a2f4d-4147-4f7b-8032-947dac4e402b" Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.967125 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-adea-account-create-8dvhn"] Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.970904 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh5ww\" (UniqueName: \"kubernetes.io/projected/638a2f4d-4147-4f7b-8032-947dac4e402b-kube-api-access-fh5ww\") pod \"watcheradea-account-delete-4kprl\" (UID: \"638a2f4d-4147-4f7b-8032-947dac4e402b\") " pod="watcher-kuttl-default/watcheradea-account-delete-4kprl" Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.984552 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-adea-account-create-8dvhn"] Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.991782 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.992059 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" containerName="watcher-kuttl-api-log" containerID="cri-o://8c4736edbd19119c5c09fed3accd8e17e5ee98ea1e12ca39fc3b59aeaeb12c20" gracePeriod=30 Oct 03 09:04:48 crc kubenswrapper[4765]: I1003 09:04:48.992538 4765 
kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" containerName="watcher-api" containerID="cri-o://ecb964afaf2f1e274b0b620a5cab69ea802ecaad5e425655dc7383b31b2525c0" gracePeriod=30 Oct 03 09:04:49 crc kubenswrapper[4765]: I1003 09:04:49.043540 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh5ww\" (UniqueName: \"kubernetes.io/projected/638a2f4d-4147-4f7b-8032-947dac4e402b-kube-api-access-fh5ww\") pod \"watcheradea-account-delete-4kprl\" (UID: \"638a2f4d-4147-4f7b-8032-947dac4e402b\") " pod="watcher-kuttl-default/watcheradea-account-delete-4kprl" Oct 03 09:04:49 crc kubenswrapper[4765]: I1003 09:04:49.619302 4765 generic.go:334] "Generic (PLEG): container finished" podID="ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" containerID="8c4736edbd19119c5c09fed3accd8e17e5ee98ea1e12ca39fc3b59aeaeb12c20" exitCode=143 Oct 03 09:04:49 crc kubenswrapper[4765]: I1003 09:04:49.619387 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcheradea-account-delete-4kprl" Oct 03 09:04:49 crc kubenswrapper[4765]: I1003 09:04:49.619771 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7","Type":"ContainerDied","Data":"8c4736edbd19119c5c09fed3accd8e17e5ee98ea1e12ca39fc3b59aeaeb12c20"} Oct 03 09:04:49 crc kubenswrapper[4765]: I1003 09:04:49.630820 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcheradea-account-delete-4kprl" Oct 03 09:04:49 crc kubenswrapper[4765]: I1003 09:04:49.682968 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh5ww\" (UniqueName: \"kubernetes.io/projected/638a2f4d-4147-4f7b-8032-947dac4e402b-kube-api-access-fh5ww\") pod \"638a2f4d-4147-4f7b-8032-947dac4e402b\" (UID: \"638a2f4d-4147-4f7b-8032-947dac4e402b\") " Oct 03 09:04:49 crc kubenswrapper[4765]: I1003 09:04:49.697761 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/638a2f4d-4147-4f7b-8032-947dac4e402b-kube-api-access-fh5ww" (OuterVolumeSpecName: "kube-api-access-fh5ww") pod "638a2f4d-4147-4f7b-8032-947dac4e402b" (UID: "638a2f4d-4147-4f7b-8032-947dac4e402b"). InnerVolumeSpecName "kube-api-access-fh5ww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:49 crc kubenswrapper[4765]: I1003 09:04:49.791508 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh5ww\" (UniqueName: \"kubernetes.io/projected/638a2f4d-4147-4f7b-8032-947dac4e402b-kube-api-access-fh5ww\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.322839 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="575c88eb-254b-45ff-ab90-bbf9cf94c4fa" path="/var/lib/kubelet/pods/575c88eb-254b-45ff-ab90-bbf9cf94c4fa/volumes" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.323392 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eab1ba2-0082-4d6c-b42d-9cb409d59fac" path="/var/lib/kubelet/pods/5eab1ba2-0082-4d6c-b42d-9cb409d59fac/volumes" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.323844 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a703dbc-56c9-490f-8180-0322a355f7f3" path="/var/lib/kubelet/pods/7a703dbc-56c9-490f-8180-0322a355f7f3/volumes" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.391509 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.525048 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gskbs\" (UniqueName: \"kubernetes.io/projected/cf49515d-149d-4084-a0bc-5f5ddb0d6739-kube-api-access-gskbs\") pod \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.525402 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-combined-ca-bundle\") pod \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.525473 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-config-data\") pod \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.525552 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf49515d-149d-4084-a0bc-5f5ddb0d6739-logs\") pod \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.525614 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-cert-memcached-mtls\") pod \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\" (UID: \"cf49515d-149d-4084-a0bc-5f5ddb0d6739\") " Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.529613 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf49515d-149d-4084-a0bc-5f5ddb0d6739-logs" (OuterVolumeSpecName: "logs") pod "cf49515d-149d-4084-a0bc-5f5ddb0d6739" (UID: "cf49515d-149d-4084-a0bc-5f5ddb0d6739"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.556550 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf49515d-149d-4084-a0bc-5f5ddb0d6739-kube-api-access-gskbs" (OuterVolumeSpecName: "kube-api-access-gskbs") pod "cf49515d-149d-4084-a0bc-5f5ddb0d6739" (UID: "cf49515d-149d-4084-a0bc-5f5ddb0d6739"). InnerVolumeSpecName "kube-api-access-gskbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.570856 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf49515d-149d-4084-a0bc-5f5ddb0d6739" (UID: "cf49515d-149d-4084-a0bc-5f5ddb0d6739"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.617722 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-config-data" (OuterVolumeSpecName: "config-data") pod "cf49515d-149d-4084-a0bc-5f5ddb0d6739" (UID: "cf49515d-149d-4084-a0bc-5f5ddb0d6739"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.631252 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gskbs\" (UniqueName: \"kubernetes.io/projected/cf49515d-149d-4084-a0bc-5f5ddb0d6739-kube-api-access-gskbs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.631308 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.631318 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.631327 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf49515d-149d-4084-a0bc-5f5ddb0d6739-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.634153 4765 generic.go:334] "Generic (PLEG): container finished" podID="cf49515d-149d-4084-a0bc-5f5ddb0d6739" containerID="29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e" exitCode=0 Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.634217 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"cf49515d-149d-4084-a0bc-5f5ddb0d6739","Type":"ContainerDied","Data":"29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e"} Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.634287 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"cf49515d-149d-4084-a0bc-5f5ddb0d6739","Type":"ContainerDied","Data":"398b3a5a0f79f557634f4c7d07b15e1d0dd9580bf73560681aa45fd38c61f6c6"} Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.634307 4765 scope.go:117] "RemoveContainer" containerID="29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e" Oct 03 09:04:50 crc 
kubenswrapper[4765]: I1003 09:04:50.634312 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.637351 4765 generic.go:334] "Generic (PLEG): container finished" podID="ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" containerID="ecb964afaf2f1e274b0b620a5cab69ea802ecaad5e425655dc7383b31b2525c0" exitCode=0 Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.637421 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7","Type":"ContainerDied","Data":"ecb964afaf2f1e274b0b620a5cab69ea802ecaad5e425655dc7383b31b2525c0"} Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.637468 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcheradea-account-delete-4kprl" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.637469 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7","Type":"ContainerDied","Data":"30ffbbfaeaa402b374ff270e313a33980f73001a32d8342d62e3fe12e94b3ff2"} Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.637552 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30ffbbfaeaa402b374ff270e313a33980f73001a32d8342d62e3fe12e94b3ff2" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.641994 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.667599 4765 scope.go:117] "RemoveContainer" containerID="29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e" Oct 03 09:04:50 crc kubenswrapper[4765]: E1003 09:04:50.668472 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e\": container with ID starting with 29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e not found: ID does not exist" containerID="29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.668522 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e"} err="failed to get container status \"29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e\": rpc error: code = NotFound desc = could not find container \"29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e\": container with ID starting with 29cded319cd697ad6a39d8b966c290756135d4b49b6e3333728a7d7d258bfe6e not found: ID does not exist" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.690040 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "cf49515d-149d-4084-a0bc-5f5ddb0d6739" (UID: "cf49515d-149d-4084-a0bc-5f5ddb0d6739"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.708136 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcheradea-account-delete-4kprl"] Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.715904 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcheradea-account-delete-4kprl"] Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.732907 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr9j2\" (UniqueName: \"kubernetes.io/projected/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-kube-api-access-lr9j2\") pod \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.732965 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-logs\") pod \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.732998 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-custom-prometheus-ca\") pod \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.733069 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-combined-ca-bundle\") pod \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.733095 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-cert-memcached-mtls\") pod \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.733168 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-config-data\") pod \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\" (UID: \"ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7\") " Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.733518 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/cf49515d-149d-4084-a0bc-5f5ddb0d6739-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.735242 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-logs" (OuterVolumeSpecName: "logs") pod "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" (UID: "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.737979 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-kube-api-access-lr9j2" (OuterVolumeSpecName: "kube-api-access-lr9j2") pod "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" (UID: "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7"). InnerVolumeSpecName "kube-api-access-lr9j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.762535 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" (UID: "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.763555 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" (UID: "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.798819 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-config-data" (OuterVolumeSpecName: "config-data") pod "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" (UID: "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.818545 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" (UID: "ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.834472 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr9j2\" (UniqueName: \"kubernetes.io/projected/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-kube-api-access-lr9j2\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.834511 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.834525 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.834537 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.834600 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.834612 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.965463 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:04:50 crc kubenswrapper[4765]: I1003 09:04:50.971283 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:04:51 crc kubenswrapper[4765]: I1003 09:04:51.644967 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:04:51 crc kubenswrapper[4765]: I1003 09:04:51.681578 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:04:51 crc kubenswrapper[4765]: I1003 09:04:51.688120 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.130479 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.130782 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="ceilometer-central-agent" containerID="cri-o://02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0" gracePeriod=30 Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.130902 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="proxy-httpd" containerID="cri-o://93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9" gracePeriod=30 Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.130947 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="sg-core" containerID="cri-o://a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce" gracePeriod=30 Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.130983 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="ceilometer-notification-agent" containerID="cri-o://9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40" gracePeriod=30 Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.316685 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="638a2f4d-4147-4f7b-8032-947dac4e402b" path="/var/lib/kubelet/pods/638a2f4d-4147-4f7b-8032-947dac4e402b/volumes" Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.317181 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf49515d-149d-4084-a0bc-5f5ddb0d6739" path="/var/lib/kubelet/pods/cf49515d-149d-4084-a0bc-5f5ddb0d6739/volumes" Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.317834 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" path="/var/lib/kubelet/pods/ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7/volumes" Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.657742 4765 generic.go:334] "Generic (PLEG): container finished" podID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerID="93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9" exitCode=0 Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.658341 4765 generic.go:334] "Generic (PLEG): container finished" podID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerID="a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce" exitCode=2 Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.657953 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"e2b874f8-a25b-46c7-bff2-45197b16caa7","Type":"ContainerDied","Data":"93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9"} Oct 03 09:04:52 crc kubenswrapper[4765]: I1003 09:04:52.658442 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e2b874f8-a25b-46c7-bff2-45197b16caa7","Type":"ContainerDied","Data":"a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce"} Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.488553 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.583170 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-logs\") pod \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.583272 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-combined-ca-bundle\") pod \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.583392 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-cert-memcached-mtls\") pod \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.583434 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-custom-prometheus-ca\") pod \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.583459 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58hdp\" (UniqueName: \"kubernetes.io/projected/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-kube-api-access-58hdp\") pod \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.583520 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-config-data\") pod \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\" (UID: \"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d\") " Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.583988 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-logs" (OuterVolumeSpecName: "logs") pod "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" (UID: "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.595811 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-kube-api-access-58hdp" (OuterVolumeSpecName: "kube-api-access-58hdp") pod "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" (UID: "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d"). InnerVolumeSpecName "kube-api-access-58hdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.612730 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" (UID: "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.626584 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" (UID: "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.656852 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" (UID: "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.660727 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-config-data" (OuterVolumeSpecName: "config-data") pod "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" (UID: "0deed7b7-7fac-4c23-bcb1-e790b70f7a9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.673021 4765 generic.go:334] "Generic (PLEG): container finished" podID="0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" containerID="3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b" exitCode=0 Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.673093 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d","Type":"ContainerDied","Data":"3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b"} Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.673122 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0deed7b7-7fac-4c23-bcb1-e790b70f7a9d","Type":"ContainerDied","Data":"daa57d9a1090ed85a1d81910bf590745fd8b2d0a451d4d14b62363956e1f6076"} Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.673139 4765 scope.go:117] "RemoveContainer" containerID="3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.673266 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.680518 4765 generic.go:334] "Generic (PLEG): container finished" podID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerID="02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0" exitCode=0 Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.680558 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e2b874f8-a25b-46c7-bff2-45197b16caa7","Type":"ContainerDied","Data":"02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0"} Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.685591 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.685637 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.685667 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58hdp\" (UniqueName: \"kubernetes.io/projected/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-kube-api-access-58hdp\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.685680 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.685691 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.685702 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.743755 4765 scope.go:117] "RemoveContainer" containerID="3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b" Oct 03 09:04:53 crc kubenswrapper[4765]: E1003 09:04:53.744670 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b\": container with ID starting with 3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b not found: ID does not exist" containerID="3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.744709 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b"} err="failed to get container status \"3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b\": rpc error: code = NotFound desc = could not find container \"3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b\": container with ID starting with 3b8ab8341c1ff541ec26df55c796b534c2eb631f6062761a0478a771544b296b not found: ID does not exist" Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 
09:04:53.750456 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:04:53 crc kubenswrapper[4765]: I1003 09:04:53.756671 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:04:54 crc kubenswrapper[4765]: I1003 09:04:54.316470 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" path="/var/lib/kubelet/pods/0deed7b7-7fac-4c23-bcb1-e790b70f7a9d/volumes" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.318728 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.413935 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-sg-core-conf-yaml\") pod \"e2b874f8-a25b-46c7-bff2-45197b16caa7\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.414044 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-combined-ca-bundle\") pod \"e2b874f8-a25b-46c7-bff2-45197b16caa7\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.414114 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-ceilometer-tls-certs\") pod \"e2b874f8-a25b-46c7-bff2-45197b16caa7\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.414156 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-scripts\") pod \"e2b874f8-a25b-46c7-bff2-45197b16caa7\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.414204 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-config-data\") pod \"e2b874f8-a25b-46c7-bff2-45197b16caa7\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.414242 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-log-httpd\") pod \"e2b874f8-a25b-46c7-bff2-45197b16caa7\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.414275 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns47t\" (UniqueName: \"kubernetes.io/projected/e2b874f8-a25b-46c7-bff2-45197b16caa7-kube-api-access-ns47t\") pod \"e2b874f8-a25b-46c7-bff2-45197b16caa7\" (UID: \"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.414330 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-run-httpd\") pod \"e2b874f8-a25b-46c7-bff2-45197b16caa7\" (UID: 
\"e2b874f8-a25b-46c7-bff2-45197b16caa7\") " Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.416830 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e2b874f8-a25b-46c7-bff2-45197b16caa7" (UID: "e2b874f8-a25b-46c7-bff2-45197b16caa7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.417591 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e2b874f8-a25b-46c7-bff2-45197b16caa7" (UID: "e2b874f8-a25b-46c7-bff2-45197b16caa7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.422908 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-scripts" (OuterVolumeSpecName: "scripts") pod "e2b874f8-a25b-46c7-bff2-45197b16caa7" (UID: "e2b874f8-a25b-46c7-bff2-45197b16caa7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.436530 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b874f8-a25b-46c7-bff2-45197b16caa7-kube-api-access-ns47t" (OuterVolumeSpecName: "kube-api-access-ns47t") pod "e2b874f8-a25b-46c7-bff2-45197b16caa7" (UID: "e2b874f8-a25b-46c7-bff2-45197b16caa7"). InnerVolumeSpecName "kube-api-access-ns47t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.475915 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e2b874f8-a25b-46c7-bff2-45197b16caa7" (UID: "e2b874f8-a25b-46c7-bff2-45197b16caa7"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.516511 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.516542 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.516553 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.516562 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns47t\" (UniqueName: \"kubernetes.io/projected/e2b874f8-a25b-46c7-bff2-45197b16caa7-kube-api-access-ns47t\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.516572 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2b874f8-a25b-46c7-bff2-45197b16caa7-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.516729 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e2b874f8-a25b-46c7-bff2-45197b16caa7" (UID: "e2b874f8-a25b-46c7-bff2-45197b16caa7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.521448 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-b4j4f"] Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.521918 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" containerName="watcher-api" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.522037 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" containerName="watcher-api" Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.522118 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="ceilometer-notification-agent" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.522173 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="ceilometer-notification-agent" Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.522230 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" containerName="watcher-kuttl-api-log" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.522297 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" containerName="watcher-kuttl-api-log" Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.522357 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="sg-core" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.522410 4765 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="sg-core" Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.522473 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="ceilometer-central-agent" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.522533 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="ceilometer-central-agent" Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.522595 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" containerName="watcher-decision-engine" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.522758 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" containerName="watcher-decision-engine" Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.522831 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf49515d-149d-4084-a0bc-5f5ddb0d6739" containerName="watcher-applier" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.522885 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf49515d-149d-4084-a0bc-5f5ddb0d6739" containerName="watcher-applier" Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.522968 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="proxy-httpd" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.523032 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="proxy-httpd" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.523235 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="proxy-httpd" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.523339 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0deed7b7-7fac-4c23-bcb1-e790b70f7a9d" containerName="watcher-decision-engine" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.523409 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="sg-core" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.523472 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" containerName="watcher-api" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.523535 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="ceilometer-central-agent" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.523597 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff7eb360-6cc9-4cd8-bd0e-1c11f76296d7" containerName="watcher-kuttl-api-log" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.523677 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerName="ceilometer-notification-agent" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.523739 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf49515d-149d-4084-a0bc-5f5ddb0d6739" containerName="watcher-applier" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.524520 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-b4j4f" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.532769 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-b4j4f"] Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.539941 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2b874f8-a25b-46c7-bff2-45197b16caa7" (UID: "e2b874f8-a25b-46c7-bff2-45197b16caa7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.550015 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-config-data" (OuterVolumeSpecName: "config-data") pod "e2b874f8-a25b-46c7-bff2-45197b16caa7" (UID: "e2b874f8-a25b-46c7-bff2-45197b16caa7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.618283 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlcv4\" (UniqueName: \"kubernetes.io/projected/520ae465-8518-4135-8906-29b80e6e0543-kube-api-access-mlcv4\") pod \"watcher-db-create-b4j4f\" (UID: \"520ae465-8518-4135-8906-29b80e6e0543\") " pod="watcher-kuttl-default/watcher-db-create-b4j4f" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.618698 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.618743 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.618757 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b874f8-a25b-46c7-bff2-45197b16caa7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.701331 4765 generic.go:334] "Generic (PLEG): container finished" podID="e2b874f8-a25b-46c7-bff2-45197b16caa7" containerID="9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40" exitCode=0 Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.701553 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e2b874f8-a25b-46c7-bff2-45197b16caa7","Type":"ContainerDied","Data":"9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40"} Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.701674 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.701693 4765 scope.go:117] "RemoveContainer" containerID="93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.701677 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"e2b874f8-a25b-46c7-bff2-45197b16caa7","Type":"ContainerDied","Data":"de7b05e5078ea573839ebd0c20ee5963c559c2f0f4d710cae8f0034494c699ae"} Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.720017 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlcv4\" (UniqueName: \"kubernetes.io/projected/520ae465-8518-4135-8906-29b80e6e0543-kube-api-access-mlcv4\") pod \"watcher-db-create-b4j4f\" (UID: \"520ae465-8518-4135-8906-29b80e6e0543\") " pod="watcher-kuttl-default/watcher-db-create-b4j4f" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.723700 4765 scope.go:117] "RemoveContainer" containerID="a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.741036 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.743033 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlcv4\" (UniqueName: \"kubernetes.io/projected/520ae465-8518-4135-8906-29b80e6e0543-kube-api-access-mlcv4\") pod \"watcher-db-create-b4j4f\" (UID: \"520ae465-8518-4135-8906-29b80e6e0543\") " pod="watcher-kuttl-default/watcher-db-create-b4j4f" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.746122 4765 scope.go:117] "RemoveContainer" containerID="9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.760720 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.778833 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.782762 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.783659 4765 scope.go:117] "RemoveContainer" containerID="02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.787170 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.787291 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.788771 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.809586 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.825225 4765 scope.go:117] "RemoveContainer" containerID="93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9" Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.827781 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9\": container with ID starting with 93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9 not found: ID does not exist" containerID="93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.827830 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9"} err="failed to get container status \"93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9\": rpc error: code = NotFound desc = could not find container \"93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9\": container with ID starting with 93bef5acdd682ab111d6e77370f9d02c11d9640abe9ef5443a6a5bfca8a9f4f9 not found: ID does not exist" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.827864 4765 scope.go:117] "RemoveContainer" containerID="a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce" Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.828172 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce\": container with ID starting with a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce not found: ID does not exist" containerID="a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.828206 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce"} err="failed to get container status \"a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce\": rpc error: code = NotFound desc = could not find container \"a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce\": container with ID starting with a72f8378a59178e3d9227eddd050776509910bb73498d32073e0984faa7c92ce not found: ID does not exist" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.828228 4765 scope.go:117] "RemoveContainer" 
containerID="9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40" Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.840288 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40\": container with ID starting with 9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40 not found: ID does not exist" containerID="9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.840340 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40"} err="failed to get container status \"9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40\": rpc error: code = NotFound desc = could not find container \"9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40\": container with ID starting with 9e59176beff49dbe1e9082f472a14d0a0615f5f796e8ada2fd8cdb7da1e3ef40 not found: ID does not exist" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.840369 4765 scope.go:117] "RemoveContainer" containerID="02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0" Oct 03 09:04:55 crc kubenswrapper[4765]: E1003 09:04:55.840820 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0\": container with ID starting with 02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0 not found: ID does not exist" containerID="02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.840847 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0"} err="failed to get container status \"02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0\": rpc error: code = NotFound desc = could not find container \"02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0\": container with ID starting with 02a4c12827acf6a03832d3ba2e7f6d31c7c59596c52784fc348f07f44e85d3a0 not found: ID does not exist" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.868839 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-b4j4f" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.923384 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-scripts\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.923476 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.923588 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.923682 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-log-httpd\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.923749 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v64gg\" (UniqueName: \"kubernetes.io/projected/26111dc6-7d31-48ad-a6a3-4bda319a71d5-kube-api-access-v64gg\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.923782 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-run-httpd\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.923822 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:55 crc kubenswrapper[4765]: I1003 09:04:55.923857 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-config-data\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.025338 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 
09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.025428 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-log-httpd\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.025490 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v64gg\" (UniqueName: \"kubernetes.io/projected/26111dc6-7d31-48ad-a6a3-4bda319a71d5-kube-api-access-v64gg\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.025528 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-run-httpd\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.025565 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.025594 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-config-data\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.025671 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-scripts\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.025699 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.027452 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-run-httpd\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.027714 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-log-httpd\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.036562 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.039799 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-config-data\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.042890 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-scripts\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.044109 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.046760 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.054123 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v64gg\" (UniqueName: \"kubernetes.io/projected/26111dc6-7d31-48ad-a6a3-4bda319a71d5-kube-api-access-v64gg\") pod \"ceilometer-0\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.107778 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.385056 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2b874f8-a25b-46c7-bff2-45197b16caa7" path="/var/lib/kubelet/pods/e2b874f8-a25b-46c7-bff2-45197b16caa7/volumes" Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.449989 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-b4j4f"] Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.665187 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.717867 4765 generic.go:334] "Generic (PLEG): container finished" podID="520ae465-8518-4135-8906-29b80e6e0543" containerID="330330f7181b4c919e902b096ed32d564070fa45840494fff43596d7ae822baf" exitCode=0 Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.717927 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-b4j4f" event={"ID":"520ae465-8518-4135-8906-29b80e6e0543","Type":"ContainerDied","Data":"330330f7181b4c919e902b096ed32d564070fa45840494fff43596d7ae822baf"} Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.717948 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-b4j4f" event={"ID":"520ae465-8518-4135-8906-29b80e6e0543","Type":"ContainerStarted","Data":"d9cb827b67c816735204213aeff5b691ebb887100305be8adc7e2e80774f77b4"} Oct 03 09:04:56 crc kubenswrapper[4765]: I1003 09:04:56.719290 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"26111dc6-7d31-48ad-a6a3-4bda319a71d5","Type":"ContainerStarted","Data":"cd4a5bda9d09c78727cc7e2119c34fb60e73604f0768d550ec618e288598d130"} Oct 03 09:04:57 crc kubenswrapper[4765]: I1003 09:04:57.728812 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"26111dc6-7d31-48ad-a6a3-4bda319a71d5","Type":"ContainerStarted","Data":"d0c6700111c0fb323a7c1c1c532293193af50fd5197bf856ddd2da55c03349eb"} Oct 03 09:04:58 crc kubenswrapper[4765]: I1003 09:04:58.162222 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-b4j4f" Oct 03 09:04:58 crc kubenswrapper[4765]: I1003 09:04:58.295212 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlcv4\" (UniqueName: \"kubernetes.io/projected/520ae465-8518-4135-8906-29b80e6e0543-kube-api-access-mlcv4\") pod \"520ae465-8518-4135-8906-29b80e6e0543\" (UID: \"520ae465-8518-4135-8906-29b80e6e0543\") " Oct 03 09:04:58 crc kubenswrapper[4765]: I1003 09:04:58.301976 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/520ae465-8518-4135-8906-29b80e6e0543-kube-api-access-mlcv4" (OuterVolumeSpecName: "kube-api-access-mlcv4") pod "520ae465-8518-4135-8906-29b80e6e0543" (UID: "520ae465-8518-4135-8906-29b80e6e0543"). InnerVolumeSpecName "kube-api-access-mlcv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:58 crc kubenswrapper[4765]: I1003 09:04:58.397862 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlcv4\" (UniqueName: \"kubernetes.io/projected/520ae465-8518-4135-8906-29b80e6e0543-kube-api-access-mlcv4\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:58 crc kubenswrapper[4765]: I1003 09:04:58.737946 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-b4j4f" event={"ID":"520ae465-8518-4135-8906-29b80e6e0543","Type":"ContainerDied","Data":"d9cb827b67c816735204213aeff5b691ebb887100305be8adc7e2e80774f77b4"} Oct 03 09:04:58 crc kubenswrapper[4765]: I1003 09:04:58.737983 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9cb827b67c816735204213aeff5b691ebb887100305be8adc7e2e80774f77b4" Oct 03 09:04:58 crc kubenswrapper[4765]: I1003 09:04:58.738031 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-b4j4f" Oct 03 09:04:58 crc kubenswrapper[4765]: I1003 09:04:58.740736 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"26111dc6-7d31-48ad-a6a3-4bda319a71d5","Type":"ContainerStarted","Data":"b22bb8689ecc9dd4f0e5d26c4d823803c919864ce9c6d9ecfee378a9cbb3370c"} Oct 03 09:04:59 crc kubenswrapper[4765]: I1003 09:04:59.751854 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"26111dc6-7d31-48ad-a6a3-4bda319a71d5","Type":"ContainerStarted","Data":"fbd64ba33438952d3c750ece586a3787f342b322c9a37f28bbbf0cd11e71ace4"} Oct 03 09:05:00 crc kubenswrapper[4765]: I1003 09:05:00.680566 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:05:00 crc kubenswrapper[4765]: I1003 09:05:00.681260 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:05:00 crc kubenswrapper[4765]: I1003 09:05:00.681378 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 09:05:00 crc kubenswrapper[4765]: I1003 09:05:00.695587 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800"} pod="openshift-machine-config-operator/machine-config-daemon-j8mss" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:05:00 crc kubenswrapper[4765]: I1003 09:05:00.695727 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" containerID="cri-o://dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" gracePeriod=600 Oct 03 09:05:00 crc kubenswrapper[4765]: I1003 09:05:00.762841 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"26111dc6-7d31-48ad-a6a3-4bda319a71d5","Type":"ContainerStarted","Data":"4dc2c23b1622c0a60ed7089232d4c876d1e18ccdba79e7b41bb64e586e0ba352"} Oct 03 09:05:00 crc kubenswrapper[4765]: I1003 09:05:00.763030 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:00 crc kubenswrapper[4765]: I1003 09:05:00.792893 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.069607572 podStartE2EDuration="5.79287216s" podCreationTimestamp="2025-10-03 09:04:55 +0000 UTC" firstStartedPulling="2025-10-03 09:04:56.671207302 +0000 UTC m=+1540.972701632" lastFinishedPulling="2025-10-03 09:05:00.39447189 +0000 UTC m=+1544.695966220" observedRunningTime="2025-10-03 09:05:00.78896325 +0000 UTC m=+1545.090457580" watchObservedRunningTime="2025-10-03 09:05:00.79287216 +0000 UTC m=+1545.094366490" Oct 03 09:05:00 crc kubenswrapper[4765]: E1003 09:05:00.823776 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:05:01 crc kubenswrapper[4765]: I1003 09:05:01.772243 4765 generic.go:334] "Generic (PLEG): container finished" podID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" exitCode=0 Oct 03 09:05:01 crc kubenswrapper[4765]: I1003 09:05:01.772350 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerDied","Data":"dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800"} Oct 03 09:05:01 crc kubenswrapper[4765]: I1003 09:05:01.772787 4765 scope.go:117] "RemoveContainer" containerID="6fb31d91d836934f6972647efe119143f3dae08e768c5506a0423da0d8bd74e8" Oct 03 09:05:01 crc kubenswrapper[4765]: I1003 09:05:01.773517 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:05:01 crc kubenswrapper[4765]: E1003 09:05:01.773849 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:05:05 crc kubenswrapper[4765]: I1003 09:05:05.636542 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-201a-account-create-xwrlp"] Oct 03 09:05:05 crc kubenswrapper[4765]: E1003 09:05:05.637149 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520ae465-8518-4135-8906-29b80e6e0543" containerName="mariadb-database-create" Oct 03 09:05:05 crc kubenswrapper[4765]: I1003 09:05:05.637163 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="520ae465-8518-4135-8906-29b80e6e0543" 
containerName="mariadb-database-create" Oct 03 09:05:05 crc kubenswrapper[4765]: I1003 09:05:05.637417 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="520ae465-8518-4135-8906-29b80e6e0543" containerName="mariadb-database-create" Oct 03 09:05:05 crc kubenswrapper[4765]: I1003 09:05:05.638169 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-201a-account-create-xwrlp" Oct 03 09:05:05 crc kubenswrapper[4765]: I1003 09:05:05.641366 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Oct 03 09:05:05 crc kubenswrapper[4765]: I1003 09:05:05.654428 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-201a-account-create-xwrlp"] Oct 03 09:05:05 crc kubenswrapper[4765]: I1003 09:05:05.821271 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xrsl\" (UniqueName: \"kubernetes.io/projected/8677a574-cd76-43b4-9a66-fcd43b04112e-kube-api-access-9xrsl\") pod \"watcher-201a-account-create-xwrlp\" (UID: \"8677a574-cd76-43b4-9a66-fcd43b04112e\") " pod="watcher-kuttl-default/watcher-201a-account-create-xwrlp" Oct 03 09:05:05 crc kubenswrapper[4765]: I1003 09:05:05.922706 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xrsl\" (UniqueName: \"kubernetes.io/projected/8677a574-cd76-43b4-9a66-fcd43b04112e-kube-api-access-9xrsl\") pod \"watcher-201a-account-create-xwrlp\" (UID: \"8677a574-cd76-43b4-9a66-fcd43b04112e\") " pod="watcher-kuttl-default/watcher-201a-account-create-xwrlp" Oct 03 09:05:05 crc kubenswrapper[4765]: I1003 09:05:05.948361 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xrsl\" (UniqueName: \"kubernetes.io/projected/8677a574-cd76-43b4-9a66-fcd43b04112e-kube-api-access-9xrsl\") pod \"watcher-201a-account-create-xwrlp\" (UID: \"8677a574-cd76-43b4-9a66-fcd43b04112e\") " pod="watcher-kuttl-default/watcher-201a-account-create-xwrlp" Oct 03 09:05:05 crc kubenswrapper[4765]: I1003 09:05:05.954996 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-201a-account-create-xwrlp" Oct 03 09:05:06 crc kubenswrapper[4765]: I1003 09:05:06.418126 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-201a-account-create-xwrlp"] Oct 03 09:05:06 crc kubenswrapper[4765]: I1003 09:05:06.814954 4765 generic.go:334] "Generic (PLEG): container finished" podID="8677a574-cd76-43b4-9a66-fcd43b04112e" containerID="c52e60ba8087540c720ae0d205856f62a2466d9553bd5bb0fed51601b94417a2" exitCode=0 Oct 03 09:05:06 crc kubenswrapper[4765]: I1003 09:05:06.814997 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-201a-account-create-xwrlp" event={"ID":"8677a574-cd76-43b4-9a66-fcd43b04112e","Type":"ContainerDied","Data":"c52e60ba8087540c720ae0d205856f62a2466d9553bd5bb0fed51601b94417a2"} Oct 03 09:05:06 crc kubenswrapper[4765]: I1003 09:05:06.815036 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-201a-account-create-xwrlp" event={"ID":"8677a574-cd76-43b4-9a66-fcd43b04112e","Type":"ContainerStarted","Data":"a1025f0061e523acb2b901c3e411e651ad7af162244e5a8c7c7df26150c38055"} Oct 03 09:05:08 crc kubenswrapper[4765]: I1003 09:05:08.202916 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-201a-account-create-xwrlp" Oct 03 09:05:08 crc kubenswrapper[4765]: I1003 09:05:08.363446 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xrsl\" (UniqueName: \"kubernetes.io/projected/8677a574-cd76-43b4-9a66-fcd43b04112e-kube-api-access-9xrsl\") pod \"8677a574-cd76-43b4-9a66-fcd43b04112e\" (UID: \"8677a574-cd76-43b4-9a66-fcd43b04112e\") " Oct 03 09:05:08 crc kubenswrapper[4765]: I1003 09:05:08.377883 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8677a574-cd76-43b4-9a66-fcd43b04112e-kube-api-access-9xrsl" (OuterVolumeSpecName: "kube-api-access-9xrsl") pod "8677a574-cd76-43b4-9a66-fcd43b04112e" (UID: "8677a574-cd76-43b4-9a66-fcd43b04112e"). InnerVolumeSpecName "kube-api-access-9xrsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:08 crc kubenswrapper[4765]: I1003 09:05:08.466846 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xrsl\" (UniqueName: \"kubernetes.io/projected/8677a574-cd76-43b4-9a66-fcd43b04112e-kube-api-access-9xrsl\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:08 crc kubenswrapper[4765]: I1003 09:05:08.832065 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-201a-account-create-xwrlp" event={"ID":"8677a574-cd76-43b4-9a66-fcd43b04112e","Type":"ContainerDied","Data":"a1025f0061e523acb2b901c3e411e651ad7af162244e5a8c7c7df26150c38055"} Oct 03 09:05:08 crc kubenswrapper[4765]: I1003 09:05:08.832385 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1025f0061e523acb2b901c3e411e651ad7af162244e5a8c7c7df26150c38055" Oct 03 09:05:08 crc kubenswrapper[4765]: I1003 09:05:08.832168 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-201a-account-create-xwrlp" Oct 03 09:05:10 crc kubenswrapper[4765]: I1003 09:05:10.877159 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk"] Oct 03 09:05:10 crc kubenswrapper[4765]: E1003 09:05:10.877893 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8677a574-cd76-43b4-9a66-fcd43b04112e" containerName="mariadb-account-create" Oct 03 09:05:10 crc kubenswrapper[4765]: I1003 09:05:10.877909 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="8677a574-cd76-43b4-9a66-fcd43b04112e" containerName="mariadb-account-create" Oct 03 09:05:10 crc kubenswrapper[4765]: I1003 09:05:10.878049 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="8677a574-cd76-43b4-9a66-fcd43b04112e" containerName="mariadb-account-create" Oct 03 09:05:10 crc kubenswrapper[4765]: I1003 09:05:10.878699 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:10 crc kubenswrapper[4765]: I1003 09:05:10.880985 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-hgx22" Oct 03 09:05:10 crc kubenswrapper[4765]: I1003 09:05:10.881324 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Oct 03 09:05:10 crc kubenswrapper[4765]: I1003 09:05:10.890085 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk"] Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.005939 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.005987 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7qdg\" (UniqueName: \"kubernetes.io/projected/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-kube-api-access-w7qdg\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.006061 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-config-data\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.006080 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.107841 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-config-data\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.107899 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.108025 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc 
kubenswrapper[4765]: I1003 09:05:11.108105 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7qdg\" (UniqueName: \"kubernetes.io/projected/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-kube-api-access-w7qdg\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.113844 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.114088 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-config-data\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.114451 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.139392 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7qdg\" (UniqueName: \"kubernetes.io/projected/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-kube-api-access-w7qdg\") pod \"watcher-kuttl-db-sync-lh9fk\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.195552 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.702261 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk"] Oct 03 09:05:11 crc kubenswrapper[4765]: I1003 09:05:11.856339 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" event={"ID":"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2","Type":"ContainerStarted","Data":"919af238144f9d28260776ad19d7df5d5398c3d91b411b834f0f221dc5ba40e9"} Oct 03 09:05:12 crc kubenswrapper[4765]: I1003 09:05:12.866919 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" event={"ID":"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2","Type":"ContainerStarted","Data":"fce137e1c1c792c1ed9d0bebbcd3b1a3cfcd74dd023097e25aa5e9a1305f2d60"} Oct 03 09:05:12 crc kubenswrapper[4765]: I1003 09:05:12.879694 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" podStartSLOduration=2.87967674 podStartE2EDuration="2.87967674s" podCreationTimestamp="2025-10-03 09:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:05:12.879466815 +0000 UTC m=+1557.180961155" watchObservedRunningTime="2025-10-03 09:05:12.87967674 +0000 UTC m=+1557.181171070" Oct 03 09:05:14 crc kubenswrapper[4765]: I1003 09:05:14.886383 4765 generic.go:334] "Generic (PLEG): container finished" podID="1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2" containerID="fce137e1c1c792c1ed9d0bebbcd3b1a3cfcd74dd023097e25aa5e9a1305f2d60" exitCode=0 Oct 03 09:05:14 crc kubenswrapper[4765]: I1003 09:05:14.886471 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" event={"ID":"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2","Type":"ContainerDied","Data":"fce137e1c1c792c1ed9d0bebbcd3b1a3cfcd74dd023097e25aa5e9a1305f2d60"} Oct 03 09:05:15 crc kubenswrapper[4765]: I1003 09:05:15.307071 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:05:15 crc kubenswrapper[4765]: E1003 09:05:15.307310 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.276389 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.390228 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-combined-ca-bundle\") pod \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.390576 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-db-sync-config-data\") pod \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.391049 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-config-data\") pod \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.391220 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7qdg\" (UniqueName: \"kubernetes.io/projected/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-kube-api-access-w7qdg\") pod \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\" (UID: \"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2\") " Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.413916 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2" (UID: "1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.417873 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-kube-api-access-w7qdg" (OuterVolumeSpecName: "kube-api-access-w7qdg") pod "1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2" (UID: "1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2"). InnerVolumeSpecName "kube-api-access-w7qdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.418146 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2" (UID: "1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.470508 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-config-data" (OuterVolumeSpecName: "config-data") pod "1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2" (UID: "1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.494349 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7qdg\" (UniqueName: \"kubernetes.io/projected/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-kube-api-access-w7qdg\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.494399 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.494414 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.494426 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.906133 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" event={"ID":"1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2","Type":"ContainerDied","Data":"919af238144f9d28260776ad19d7df5d5398c3d91b411b834f0f221dc5ba40e9"} Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.906500 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="919af238144f9d28260776ad19d7df5d5398c3d91b411b834f0f221dc5ba40e9" Oct 03 09:05:16 crc kubenswrapper[4765]: I1003 09:05:16.906422 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.236099 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:05:17 crc kubenswrapper[4765]: E1003 09:05:17.236436 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2" containerName="watcher-kuttl-db-sync" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.236451 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2" containerName="watcher-kuttl-db-sync" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.236613 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2" containerName="watcher-kuttl-db-sync" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.237166 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.239246 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-hgx22" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.240035 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.252169 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.307308 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfc2s\" (UniqueName: \"kubernetes.io/projected/3669b09d-23ed-4f85-b730-22c36851ca02-kube-api-access-qfc2s\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.307362 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.307394 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.307549 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.307591 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3669b09d-23ed-4f85-b730-22c36851ca02-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.307737 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.328453 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.330906 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.337712 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.386365 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.411636 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.411716 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.411742 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrsd\" (UniqueName: \"kubernetes.io/projected/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-kube-api-access-mvrsd\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.411768 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.411802 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.411827 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.411851 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3669b09d-23ed-4f85-b730-22c36851ca02-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.411931 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.411958 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.411986 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.412028 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.412059 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfc2s\" (UniqueName: \"kubernetes.io/projected/3669b09d-23ed-4f85-b730-22c36851ca02-kube-api-access-qfc2s\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.418161 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3669b09d-23ed-4f85-b730-22c36851ca02-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.425960 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.426225 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.428960 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.431312 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.443755 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.445374 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.460992 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfc2s\" (UniqueName: \"kubernetes.io/projected/3669b09d-23ed-4f85-b730-22c36851ca02-kube-api-access-qfc2s\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.461051 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.462331 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.477192 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.514198 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.514249 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrsd\" (UniqueName: \"kubernetes.io/projected/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-kube-api-access-mvrsd\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.514274 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.514365 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.514395 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 
09:05:17.514442 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.514841 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-logs\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.520716 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.533179 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.533416 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.536357 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.543104 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.548989 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.552589 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.564233 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrsd\" (UniqueName: \"kubernetes.io/projected/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-kube-api-access-mvrsd\") pod \"watcher-kuttl-api-0\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.628341 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkgjr\" (UniqueName: \"kubernetes.io/projected/bba0b645-70f1-4933-8370-f24077971b0c-kube-api-access-gkgjr\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.628391 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.628417 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.628447 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-logs\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.628463 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.628483 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhgmp\" (UniqueName: \"kubernetes.io/projected/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-kube-api-access-jhgmp\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.628505 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.628523 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.628563 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.628600 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bba0b645-70f1-4933-8370-f24077971b0c-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.628620 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.655557 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.730585 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bba0b645-70f1-4933-8370-f24077971b0c-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.749104 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.749254 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgjr\" (UniqueName: \"kubernetes.io/projected/bba0b645-70f1-4933-8370-f24077971b0c-kube-api-access-gkgjr\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.749305 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.749348 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " 
pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.749411 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-logs\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.749436 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.749470 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhgmp\" (UniqueName: \"kubernetes.io/projected/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-kube-api-access-jhgmp\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.749508 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.749564 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.750377 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.752010 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-logs\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.735811 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bba0b645-70f1-4933-8370-f24077971b0c-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.756491 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.757330 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.760498 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.761362 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.762866 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.766956 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.773271 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.784236 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkgjr\" (UniqueName: \"kubernetes.io/projected/bba0b645-70f1-4933-8370-f24077971b0c-kube-api-access-gkgjr\") pod \"watcher-kuttl-applier-0\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.795968 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhgmp\" (UniqueName: \"kubernetes.io/projected/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-kube-api-access-jhgmp\") pod \"watcher-kuttl-api-1\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.956745 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:17 crc kubenswrapper[4765]: I1003 09:05:17.979338 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.171096 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:05:18 crc kubenswrapper[4765]: W1003 09:05:18.321287 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d8a89f7_c9ca_44bc_a600_6c2ff2d369f9.slice/crio-e8fb7bea615170fc165565ff5e182140c36cca8f15b68d12084f416142f9b574 WatchSource:0}: Error finding container e8fb7bea615170fc165565ff5e182140c36cca8f15b68d12084f416142f9b574: Status 404 returned error can't find the container with id e8fb7bea615170fc165565ff5e182140c36cca8f15b68d12084f416142f9b574 Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.328989 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.506305 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.654143 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.943481 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"bba0b645-70f1-4933-8370-f24077971b0c","Type":"ContainerStarted","Data":"7bc063ab757caea4438ed944cc746e782c86db03ec7ef54219691ba86dc3f431"} Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.944159 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"bba0b645-70f1-4933-8370-f24077971b0c","Type":"ContainerStarted","Data":"00de83f377edc168b6df4adc489d9de51299b7fc88ddb70b0562c8f10344349e"} Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.950097 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9","Type":"ContainerStarted","Data":"c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d"} Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.950319 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9","Type":"ContainerStarted","Data":"3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099"} Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.951204 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.951739 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9","Type":"ContainerStarted","Data":"e8fb7bea615170fc165565ff5e182140c36cca8f15b68d12084f416142f9b574"} Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.953119 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"3669b09d-23ed-4f85-b730-22c36851ca02","Type":"ContainerStarted","Data":"61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9"} Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.953243 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"3669b09d-23ed-4f85-b730-22c36851ca02","Type":"ContainerStarted","Data":"8654bd470422c1029ec068530a117633cbf989aca06415480edbcbf7b10e3cc2"} Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.953904 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.183:9322/\": dial tcp 10.217.0.183:9322: connect: connection refused" Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.955566 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4","Type":"ContainerStarted","Data":"64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a"} Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.955611 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4","Type":"ContainerStarted","Data":"a72a4d7a0ef6b6248f21e63472aed01b3b6ba7e55baeaed0d1185dd236812c04"} Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.971016 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.9709938280000001 podStartE2EDuration="1.970993828s" podCreationTimestamp="2025-10-03 09:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:05:18.963293769 +0000 UTC m=+1563.264788099" watchObservedRunningTime="2025-10-03 09:05:18.970993828 +0000 UTC m=+1563.272488158" Oct 03 09:05:18 crc kubenswrapper[4765]: I1003 09:05:18.993719 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.993700122 podStartE2EDuration="1.993700122s" podCreationTimestamp="2025-10-03 09:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:05:18.989211166 +0000 UTC m=+1563.290705506" watchObservedRunningTime="2025-10-03 09:05:18.993700122 +0000 UTC m=+1563.295194452" Oct 03 09:05:19 crc kubenswrapper[4765]: I1003 09:05:19.014321 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.014300632 podStartE2EDuration="2.014300632s" podCreationTimestamp="2025-10-03 09:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:05:19.008529503 +0000 UTC m=+1563.310023843" watchObservedRunningTime="2025-10-03 09:05:19.014300632 +0000 UTC m=+1563.315794962" Oct 03 09:05:19 crc kubenswrapper[4765]: I1003 09:05:19.966790 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4","Type":"ContainerStarted","Data":"35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197"} Oct 03 09:05:19 crc kubenswrapper[4765]: I1003 09:05:19.993418 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-1" podStartSLOduration=2.993398924 podStartE2EDuration="2.993398924s" 
podCreationTimestamp="2025-10-03 09:05:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:05:19.989885014 +0000 UTC m=+1564.291379364" watchObservedRunningTime="2025-10-03 09:05:19.993398924 +0000 UTC m=+1564.294893254" Oct 03 09:05:20 crc kubenswrapper[4765]: I1003 09:05:20.977885 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:22 crc kubenswrapper[4765]: I1003 09:05:22.625773 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:22 crc kubenswrapper[4765]: I1003 09:05:22.656117 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:22 crc kubenswrapper[4765]: I1003 09:05:22.957309 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:22 crc kubenswrapper[4765]: I1003 09:05:22.980482 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:22 crc kubenswrapper[4765]: I1003 09:05:22.989723 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:05:23 crc kubenswrapper[4765]: I1003 09:05:23.517918 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:26 crc kubenswrapper[4765]: I1003 09:05:26.116811 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:27 crc kubenswrapper[4765]: I1003 09:05:27.553234 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:27 crc kubenswrapper[4765]: I1003 09:05:27.582467 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:27 crc kubenswrapper[4765]: I1003 09:05:27.656681 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:27 crc kubenswrapper[4765]: I1003 09:05:27.661558 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:27 crc kubenswrapper[4765]: I1003 09:05:27.957485 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:27 crc kubenswrapper[4765]: I1003 09:05:27.963242 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:27 crc kubenswrapper[4765]: I1003 09:05:27.979446 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:28 crc kubenswrapper[4765]: I1003 09:05:28.007019 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:28 crc kubenswrapper[4765]: I1003 09:05:28.033728 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:28 crc kubenswrapper[4765]: I1003 09:05:28.041556 4765 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:28 crc kubenswrapper[4765]: I1003 09:05:28.041614 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:28 crc kubenswrapper[4765]: I1003 09:05:28.065729 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:28 crc kubenswrapper[4765]: I1003 09:05:28.079317 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:28 crc kubenswrapper[4765]: I1003 09:05:28.307245 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:05:28 crc kubenswrapper[4765]: E1003 09:05:28.307521 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:05:29 crc kubenswrapper[4765]: I1003 09:05:29.669395 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:29 crc kubenswrapper[4765]: I1003 09:05:29.670148 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="proxy-httpd" containerID="cri-o://4dc2c23b1622c0a60ed7089232d4c876d1e18ccdba79e7b41bb64e586e0ba352" gracePeriod=30 Oct 03 09:05:29 crc kubenswrapper[4765]: I1003 09:05:29.670199 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="sg-core" containerID="cri-o://fbd64ba33438952d3c750ece586a3787f342b322c9a37f28bbbf0cd11e71ace4" gracePeriod=30 Oct 03 09:05:29 crc kubenswrapper[4765]: I1003 09:05:29.670292 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="ceilometer-notification-agent" containerID="cri-o://b22bb8689ecc9dd4f0e5d26c4d823803c919864ce9c6d9ecfee378a9cbb3370c" gracePeriod=30 Oct 03 09:05:29 crc kubenswrapper[4765]: I1003 09:05:29.670430 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="ceilometer-central-agent" containerID="cri-o://d0c6700111c0fb323a7c1c1c532293193af50fd5197bf856ddd2da55c03349eb" gracePeriod=30 Oct 03 09:05:30 crc kubenswrapper[4765]: I1003 09:05:30.051363 4765 generic.go:334] "Generic (PLEG): container finished" podID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerID="4dc2c23b1622c0a60ed7089232d4c876d1e18ccdba79e7b41bb64e586e0ba352" exitCode=0 Oct 03 09:05:30 crc kubenswrapper[4765]: I1003 09:05:30.051397 4765 generic.go:334] "Generic (PLEG): container finished" podID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerID="fbd64ba33438952d3c750ece586a3787f342b322c9a37f28bbbf0cd11e71ace4" exitCode=2 Oct 03 09:05:30 crc kubenswrapper[4765]: I1003 09:05:30.051453 4765 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"26111dc6-7d31-48ad-a6a3-4bda319a71d5","Type":"ContainerDied","Data":"4dc2c23b1622c0a60ed7089232d4c876d1e18ccdba79e7b41bb64e586e0ba352"} Oct 03 09:05:30 crc kubenswrapper[4765]: I1003 09:05:30.051509 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"26111dc6-7d31-48ad-a6a3-4bda319a71d5","Type":"ContainerDied","Data":"fbd64ba33438952d3c750ece586a3787f342b322c9a37f28bbbf0cd11e71ace4"} Oct 03 09:05:31 crc kubenswrapper[4765]: I1003 09:05:31.064671 4765 generic.go:334] "Generic (PLEG): container finished" podID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerID="d0c6700111c0fb323a7c1c1c532293193af50fd5197bf856ddd2da55c03349eb" exitCode=0 Oct 03 09:05:31 crc kubenswrapper[4765]: I1003 09:05:31.064977 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"26111dc6-7d31-48ad-a6a3-4bda319a71d5","Type":"ContainerDied","Data":"d0c6700111c0fb323a7c1c1c532293193af50fd5197bf856ddd2da55c03349eb"} Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.098908 4765 generic.go:334] "Generic (PLEG): container finished" podID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerID="b22bb8689ecc9dd4f0e5d26c4d823803c919864ce9c6d9ecfee378a9cbb3370c" exitCode=0 Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.099498 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"26111dc6-7d31-48ad-a6a3-4bda319a71d5","Type":"ContainerDied","Data":"b22bb8689ecc9dd4f0e5d26c4d823803c919864ce9c6d9ecfee378a9cbb3370c"} Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.865852 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.990039 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-scripts\") pod \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.990364 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v64gg\" (UniqueName: \"kubernetes.io/projected/26111dc6-7d31-48ad-a6a3-4bda319a71d5-kube-api-access-v64gg\") pod \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.990395 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-run-httpd\") pod \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.990423 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-log-httpd\") pod \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.990439 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-combined-ca-bundle\") pod \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\" (UID: 
\"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.990905 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "26111dc6-7d31-48ad-a6a3-4bda319a71d5" (UID: "26111dc6-7d31-48ad-a6a3-4bda319a71d5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.990914 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "26111dc6-7d31-48ad-a6a3-4bda319a71d5" (UID: "26111dc6-7d31-48ad-a6a3-4bda319a71d5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.991777 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-config-data\") pod \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.991824 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-sg-core-conf-yaml\") pod \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.991855 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-ceilometer-tls-certs\") pod \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\" (UID: \"26111dc6-7d31-48ad-a6a3-4bda319a71d5\") " Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.992511 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.992529 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/26111dc6-7d31-48ad-a6a3-4bda319a71d5-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.995538 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26111dc6-7d31-48ad-a6a3-4bda319a71d5-kube-api-access-v64gg" (OuterVolumeSpecName: "kube-api-access-v64gg") pod "26111dc6-7d31-48ad-a6a3-4bda319a71d5" (UID: "26111dc6-7d31-48ad-a6a3-4bda319a71d5"). InnerVolumeSpecName "kube-api-access-v64gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:35 crc kubenswrapper[4765]: I1003 09:05:35.995593 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-scripts" (OuterVolumeSpecName: "scripts") pod "26111dc6-7d31-48ad-a6a3-4bda319a71d5" (UID: "26111dc6-7d31-48ad-a6a3-4bda319a71d5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.027759 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "26111dc6-7d31-48ad-a6a3-4bda319a71d5" (UID: "26111dc6-7d31-48ad-a6a3-4bda319a71d5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.048354 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "26111dc6-7d31-48ad-a6a3-4bda319a71d5" (UID: "26111dc6-7d31-48ad-a6a3-4bda319a71d5"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.054885 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26111dc6-7d31-48ad-a6a3-4bda319a71d5" (UID: "26111dc6-7d31-48ad-a6a3-4bda319a71d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.089807 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-config-data" (OuterVolumeSpecName: "config-data") pod "26111dc6-7d31-48ad-a6a3-4bda319a71d5" (UID: "26111dc6-7d31-48ad-a6a3-4bda319a71d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.093789 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.093826 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.093850 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.093860 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.093873 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26111dc6-7d31-48ad-a6a3-4bda319a71d5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.093896 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v64gg\" (UniqueName: \"kubernetes.io/projected/26111dc6-7d31-48ad-a6a3-4bda319a71d5-kube-api-access-v64gg\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.118086 4765 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"26111dc6-7d31-48ad-a6a3-4bda319a71d5","Type":"ContainerDied","Data":"cd4a5bda9d09c78727cc7e2119c34fb60e73604f0768d550ec618e288598d130"} Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.118147 4765 scope.go:117] "RemoveContainer" containerID="4dc2c23b1622c0a60ed7089232d4c876d1e18ccdba79e7b41bb64e586e0ba352" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.118203 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.162004 4765 scope.go:117] "RemoveContainer" containerID="fbd64ba33438952d3c750ece586a3787f342b322c9a37f28bbbf0cd11e71ace4" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.171685 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.179976 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.184451 4765 scope.go:117] "RemoveContainer" containerID="b22bb8689ecc9dd4f0e5d26c4d823803c919864ce9c6d9ecfee378a9cbb3370c" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.203297 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:36 crc kubenswrapper[4765]: E1003 09:05:36.204281 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="sg-core" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.204591 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="sg-core" Oct 03 09:05:36 crc kubenswrapper[4765]: E1003 09:05:36.204704 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="ceilometer-notification-agent" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.204755 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="ceilometer-notification-agent" Oct 03 09:05:36 crc kubenswrapper[4765]: E1003 09:05:36.204819 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="ceilometer-central-agent" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.204867 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="ceilometer-central-agent" Oct 03 09:05:36 crc kubenswrapper[4765]: E1003 09:05:36.204938 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="proxy-httpd" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.204988 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="proxy-httpd" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.205266 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="sg-core" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.205334 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="proxy-httpd" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.205386 4765 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="ceilometer-central-agent" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.205446 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" containerName="ceilometer-notification-agent" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.209210 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.214746 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.215019 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.215283 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.215736 4765 scope.go:117] "RemoveContainer" containerID="d0c6700111c0fb323a7c1c1c532293193af50fd5197bf856ddd2da55c03349eb" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.217212 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.299478 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-config-data\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.299526 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-run-httpd\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.299573 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.299613 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.299684 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-scripts\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.299702 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.299732 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qnjp\" (UniqueName: \"kubernetes.io/projected/6ffed54e-6e2d-463d-b21b-f9ce909b1264-kube-api-access-2qnjp\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.299749 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-log-httpd\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.318497 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26111dc6-7d31-48ad-a6a3-4bda319a71d5" path="/var/lib/kubelet/pods/26111dc6-7d31-48ad-a6a3-4bda319a71d5/volumes" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.401052 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.401151 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-scripts\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.401173 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.401206 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qnjp\" (UniqueName: \"kubernetes.io/projected/6ffed54e-6e2d-463d-b21b-f9ce909b1264-kube-api-access-2qnjp\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.401227 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-log-httpd\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.401263 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-run-httpd\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.401281 4765 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-config-data\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.401318 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.402572 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-log-httpd\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.402718 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-run-httpd\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.406129 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.406380 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.406380 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-scripts\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.407072 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.408368 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-config-data\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.419091 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qnjp\" (UniqueName: \"kubernetes.io/projected/6ffed54e-6e2d-463d-b21b-f9ce909b1264-kube-api-access-2qnjp\") pod \"ceilometer-0\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.538844 4765 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.713848 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.725053 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.751564 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.806872 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxvs7\" (UniqueName: \"kubernetes.io/projected/1dee5256-0d10-4009-9774-c57edd4aa274-kube-api-access-lxvs7\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.806941 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.807111 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-config-data\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.807183 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.807251 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dee5256-0d10-4009-9774-c57edd4aa274-logs\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.807469 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-combined-ca-bundle\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.909307 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dee5256-0d10-4009-9774-c57edd4aa274-logs\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.909443 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-combined-ca-bundle\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.909556 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxvs7\" (UniqueName: \"kubernetes.io/projected/1dee5256-0d10-4009-9774-c57edd4aa274-kube-api-access-lxvs7\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.909580 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.909642 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-config-data\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.909692 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.910915 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dee5256-0d10-4009-9774-c57edd4aa274-logs\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.917845 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-custom-prometheus-ca\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.917868 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-cert-memcached-mtls\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.918132 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-config-data\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.919917 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-combined-ca-bundle\") pod 
\"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:36 crc kubenswrapper[4765]: I1003 09:05:36.934992 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxvs7\" (UniqueName: \"kubernetes.io/projected/1dee5256-0d10-4009-9774-c57edd4aa274-kube-api-access-lxvs7\") pod \"watcher-kuttl-api-2\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:37 crc kubenswrapper[4765]: W1003 09:05:37.017943 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ffed54e_6e2d_463d_b21b_f9ce909b1264.slice/crio-f48e8215dd5a95254c08479d945518e6ee3d3418b6b3a656df9b6d258fbe4fb1 WatchSource:0}: Error finding container f48e8215dd5a95254c08479d945518e6ee3d3418b6b3a656df9b6d258fbe4fb1: Status 404 returned error can't find the container with id f48e8215dd5a95254c08479d945518e6ee3d3418b6b3a656df9b6d258fbe4fb1 Oct 03 09:05:37 crc kubenswrapper[4765]: I1003 09:05:37.018812 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:37 crc kubenswrapper[4765]: I1003 09:05:37.021358 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:05:37 crc kubenswrapper[4765]: I1003 09:05:37.064474 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:37 crc kubenswrapper[4765]: I1003 09:05:37.152458 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6ffed54e-6e2d-463d-b21b-f9ce909b1264","Type":"ContainerStarted","Data":"f48e8215dd5a95254c08479d945518e6ee3d3418b6b3a656df9b6d258fbe4fb1"} Oct 03 09:05:37 crc kubenswrapper[4765]: W1003 09:05:37.635078 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dee5256_0d10_4009_9774_c57edd4aa274.slice/crio-672dcb0547ce26de1af904bd31939fd4d41a9622f60bf902ad529e4e81c2d342 WatchSource:0}: Error finding container 672dcb0547ce26de1af904bd31939fd4d41a9622f60bf902ad529e4e81c2d342: Status 404 returned error can't find the container with id 672dcb0547ce26de1af904bd31939fd4d41a9622f60bf902ad529e4e81c2d342 Oct 03 09:05:37 crc kubenswrapper[4765]: I1003 09:05:37.635168 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Oct 03 09:05:38 crc kubenswrapper[4765]: I1003 09:05:38.162407 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"1dee5256-0d10-4009-9774-c57edd4aa274","Type":"ContainerStarted","Data":"4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1"} Oct 03 09:05:38 crc kubenswrapper[4765]: I1003 09:05:38.162977 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"1dee5256-0d10-4009-9774-c57edd4aa274","Type":"ContainerStarted","Data":"92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5"} Oct 03 09:05:38 crc kubenswrapper[4765]: I1003 09:05:38.162996 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"1dee5256-0d10-4009-9774-c57edd4aa274","Type":"ContainerStarted","Data":"672dcb0547ce26de1af904bd31939fd4d41a9622f60bf902ad529e4e81c2d342"} Oct 03 09:05:38 
crc kubenswrapper[4765]: I1003 09:05:38.163839 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:38 crc kubenswrapper[4765]: I1003 09:05:38.166532 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6ffed54e-6e2d-463d-b21b-f9ce909b1264","Type":"ContainerStarted","Data":"96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60"} Oct 03 09:05:38 crc kubenswrapper[4765]: I1003 09:05:38.182302 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-2" podStartSLOduration=2.182280328 podStartE2EDuration="2.182280328s" podCreationTimestamp="2025-10-03 09:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:05:38.179800034 +0000 UTC m=+1582.481294364" watchObservedRunningTime="2025-10-03 09:05:38.182280328 +0000 UTC m=+1582.483774658" Oct 03 09:05:39 crc kubenswrapper[4765]: I1003 09:05:39.176473 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6ffed54e-6e2d-463d-b21b-f9ce909b1264","Type":"ContainerStarted","Data":"b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c"} Oct 03 09:05:40 crc kubenswrapper[4765]: I1003 09:05:40.189457 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:05:40 crc kubenswrapper[4765]: I1003 09:05:40.189446 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6ffed54e-6e2d-463d-b21b-f9ce909b1264","Type":"ContainerStarted","Data":"32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799"} Oct 03 09:05:40 crc kubenswrapper[4765]: I1003 09:05:40.905936 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:42 crc kubenswrapper[4765]: I1003 09:05:42.065553 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:42 crc kubenswrapper[4765]: I1003 09:05:42.217845 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6ffed54e-6e2d-463d-b21b-f9ce909b1264","Type":"ContainerStarted","Data":"4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442"} Oct 03 09:05:42 crc kubenswrapper[4765]: I1003 09:05:42.217916 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:42 crc kubenswrapper[4765]: I1003 09:05:42.246913 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.287969007 podStartE2EDuration="6.246891069s" podCreationTimestamp="2025-10-03 09:05:36 +0000 UTC" firstStartedPulling="2025-10-03 09:05:37.021097501 +0000 UTC m=+1581.322591831" lastFinishedPulling="2025-10-03 09:05:40.980019563 +0000 UTC m=+1585.281513893" observedRunningTime="2025-10-03 09:05:42.238005811 +0000 UTC m=+1586.539500141" watchObservedRunningTime="2025-10-03 09:05:42.246891069 +0000 UTC m=+1586.548385399" Oct 03 09:05:43 crc kubenswrapper[4765]: I1003 09:05:43.306274 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:05:43 crc kubenswrapper[4765]: E1003 09:05:43.306761 4765 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:05:47 crc kubenswrapper[4765]: I1003 09:05:47.065114 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:47 crc kubenswrapper[4765]: I1003 09:05:47.074545 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:47 crc kubenswrapper[4765]: I1003 09:05:47.261508 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:48 crc kubenswrapper[4765]: I1003 09:05:48.344846 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Oct 03 09:05:48 crc kubenswrapper[4765]: I1003 09:05:48.352838 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:05:48 crc kubenswrapper[4765]: I1003 09:05:48.353112 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" containerName="watcher-kuttl-api-log" containerID="cri-o://64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a" gracePeriod=30 Oct 03 09:05:48 crc kubenswrapper[4765]: I1003 09:05:48.353234 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" containerName="watcher-api" containerID="cri-o://35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197" gracePeriod=30 Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.227987 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.275836 4765 generic.go:334] "Generic (PLEG): container finished" podID="b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" containerID="35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197" exitCode=0 Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.276324 4765 generic.go:334] "Generic (PLEG): container finished" podID="b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" containerID="64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a" exitCode=143 Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.275944 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.275978 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4","Type":"ContainerDied","Data":"35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197"} Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.276425 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4","Type":"ContainerDied","Data":"64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a"} Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.276446 4765 scope.go:117] "RemoveContainer" containerID="35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.276447 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4","Type":"ContainerDied","Data":"a72a4d7a0ef6b6248f21e63472aed01b3b6ba7e55baeaed0d1185dd236812c04"} Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.276791 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-2" podUID="1dee5256-0d10-4009-9774-c57edd4aa274" containerName="watcher-kuttl-api-log" containerID="cri-o://92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5" gracePeriod=30 Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.276904 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-2" podUID="1dee5256-0d10-4009-9774-c57edd4aa274" containerName="watcher-api" containerID="cri-o://4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1" gracePeriod=30 Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.326617 4765 scope.go:117] "RemoveContainer" containerID="64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.329620 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-logs\") pod \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.329726 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-custom-prometheus-ca\") pod \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.329761 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhgmp\" (UniqueName: \"kubernetes.io/projected/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-kube-api-access-jhgmp\") pod \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.329815 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-config-data\") pod \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " Oct 03 09:05:49 crc 
kubenswrapper[4765]: I1003 09:05:49.329849 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-combined-ca-bundle\") pod \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.329899 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-cert-memcached-mtls\") pod \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\" (UID: \"b9a8c09a-1859-47cd-b598-a9ffd6ce62b4\") " Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.331046 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-logs" (OuterVolumeSpecName: "logs") pod "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" (UID: "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.335897 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-kube-api-access-jhgmp" (OuterVolumeSpecName: "kube-api-access-jhgmp") pod "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" (UID: "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4"). InnerVolumeSpecName "kube-api-access-jhgmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.350756 4765 scope.go:117] "RemoveContainer" containerID="35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197" Oct 03 09:05:49 crc kubenswrapper[4765]: E1003 09:05:49.351245 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197\": container with ID starting with 35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197 not found: ID does not exist" containerID="35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.351283 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197"} err="failed to get container status \"35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197\": rpc error: code = NotFound desc = could not find container \"35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197\": container with ID starting with 35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197 not found: ID does not exist" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.351307 4765 scope.go:117] "RemoveContainer" containerID="64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a" Oct 03 09:05:49 crc kubenswrapper[4765]: E1003 09:05:49.351580 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a\": container with ID starting with 64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a not found: ID does not exist" containerID="64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.351607 4765 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a"} err="failed to get container status \"64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a\": rpc error: code = NotFound desc = could not find container \"64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a\": container with ID starting with 64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a not found: ID does not exist" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.351630 4765 scope.go:117] "RemoveContainer" containerID="35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.351899 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197"} err="failed to get container status \"35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197\": rpc error: code = NotFound desc = could not find container \"35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197\": container with ID starting with 35842a981688a41a3c48a85757b385de00e81f2b3badff0c13b7def96d800197 not found: ID does not exist" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.351920 4765 scope.go:117] "RemoveContainer" containerID="64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.352117 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a"} err="failed to get container status \"64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a\": rpc error: code = NotFound desc = could not find container \"64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a\": container with ID starting with 64d7296706e7417ff5478686cdce2ac08109f591282cde57f967ed3578b2764a not found: ID does not exist" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.356856 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" (UID: "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.358950 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" (UID: "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.375403 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-config-data" (OuterVolumeSpecName: "config-data") pod "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" (UID: "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.413388 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" (UID: "b9a8c09a-1859-47cd-b598-a9ffd6ce62b4"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.432198 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.432955 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.432995 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.433011 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhgmp\" (UniqueName: \"kubernetes.io/projected/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-kube-api-access-jhgmp\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.433036 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.433050 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.608446 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:05:49 crc kubenswrapper[4765]: I1003 09:05:49.614799 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.123111 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.144919 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dee5256-0d10-4009-9774-c57edd4aa274-logs\") pod \"1dee5256-0d10-4009-9774-c57edd4aa274\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.144992 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxvs7\" (UniqueName: \"kubernetes.io/projected/1dee5256-0d10-4009-9774-c57edd4aa274-kube-api-access-lxvs7\") pod \"1dee5256-0d10-4009-9774-c57edd4aa274\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.145043 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-combined-ca-bundle\") pod \"1dee5256-0d10-4009-9774-c57edd4aa274\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.145149 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-custom-prometheus-ca\") pod \"1dee5256-0d10-4009-9774-c57edd4aa274\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.145269 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-cert-memcached-mtls\") pod \"1dee5256-0d10-4009-9774-c57edd4aa274\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.145322 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-config-data\") pod \"1dee5256-0d10-4009-9774-c57edd4aa274\" (UID: \"1dee5256-0d10-4009-9774-c57edd4aa274\") " Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.145466 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dee5256-0d10-4009-9774-c57edd4aa274-logs" (OuterVolumeSpecName: "logs") pod "1dee5256-0d10-4009-9774-c57edd4aa274" (UID: "1dee5256-0d10-4009-9774-c57edd4aa274"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.145805 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dee5256-0d10-4009-9774-c57edd4aa274-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.152253 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dee5256-0d10-4009-9774-c57edd4aa274-kube-api-access-lxvs7" (OuterVolumeSpecName: "kube-api-access-lxvs7") pod "1dee5256-0d10-4009-9774-c57edd4aa274" (UID: "1dee5256-0d10-4009-9774-c57edd4aa274"). InnerVolumeSpecName "kube-api-access-lxvs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.182777 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dee5256-0d10-4009-9774-c57edd4aa274" (UID: "1dee5256-0d10-4009-9774-c57edd4aa274"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.193106 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-config-data" (OuterVolumeSpecName: "config-data") pod "1dee5256-0d10-4009-9774-c57edd4aa274" (UID: "1dee5256-0d10-4009-9774-c57edd4aa274"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.198796 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "1dee5256-0d10-4009-9774-c57edd4aa274" (UID: "1dee5256-0d10-4009-9774-c57edd4aa274"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.213925 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "1dee5256-0d10-4009-9774-c57edd4aa274" (UID: "1dee5256-0d10-4009-9774-c57edd4aa274"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.247055 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.247102 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxvs7\" (UniqueName: \"kubernetes.io/projected/1dee5256-0d10-4009-9774-c57edd4aa274-kube-api-access-lxvs7\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.247113 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.247122 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.247132 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1dee5256-0d10-4009-9774-c57edd4aa274-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.288104 4765 generic.go:334] "Generic (PLEG): container finished" podID="1dee5256-0d10-4009-9774-c57edd4aa274" containerID="4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1" exitCode=0 Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.288143 4765 generic.go:334] "Generic (PLEG): container finished" podID="1dee5256-0d10-4009-9774-c57edd4aa274" containerID="92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5" exitCode=143 Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.288188 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"1dee5256-0d10-4009-9774-c57edd4aa274","Type":"ContainerDied","Data":"4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1"} Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.288215 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"1dee5256-0d10-4009-9774-c57edd4aa274","Type":"ContainerDied","Data":"92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5"} Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.288224 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-2" event={"ID":"1dee5256-0d10-4009-9774-c57edd4aa274","Type":"ContainerDied","Data":"672dcb0547ce26de1af904bd31939fd4d41a9622f60bf902ad529e4e81c2d342"} Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.288239 4765 scope.go:117] "RemoveContainer" containerID="4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.288328 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-2" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.310177 4765 scope.go:117] "RemoveContainer" containerID="92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.316698 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" path="/var/lib/kubelet/pods/b9a8c09a-1859-47cd-b598-a9ffd6ce62b4/volumes" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.328354 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.332015 4765 scope.go:117] "RemoveContainer" containerID="4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1" Oct 03 09:05:50 crc kubenswrapper[4765]: E1003 09:05:50.332480 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1\": container with ID starting with 4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1 not found: ID does not exist" containerID="4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.332512 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1"} err="failed to get container status \"4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1\": rpc error: code = NotFound desc = could not find container \"4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1\": container with ID starting with 4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1 not found: ID does not exist" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.332532 4765 scope.go:117] "RemoveContainer" containerID="92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5" Oct 03 09:05:50 crc kubenswrapper[4765]: E1003 09:05:50.334920 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5\": container with ID starting with 92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5 not found: ID does not exist" containerID="92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.338844 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5"} err="failed to get container status \"92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5\": rpc error: code = NotFound desc = could not find container \"92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5\": container with ID starting with 92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5 not found: ID does not exist" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.338908 4765 scope.go:117] "RemoveContainer" containerID="4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.341197 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-2"] Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.343117 
4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1"} err="failed to get container status \"4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1\": rpc error: code = NotFound desc = could not find container \"4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1\": container with ID starting with 4131db2de2276c7ad976d0da09d58fd8f277c0d18f919c79babc9f0f34ccf6b1 not found: ID does not exist" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.343171 4765 scope.go:117] "RemoveContainer" containerID="92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.343480 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5"} err="failed to get container status \"92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5\": rpc error: code = NotFound desc = could not find container \"92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5\": container with ID starting with 92b99aa09e2921ae248bdacb90411999fa28be0540d565b402fe30155a9503e5 not found: ID does not exist" Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.609123 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.609359 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" containerName="watcher-kuttl-api-log" containerID="cri-o://3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099" gracePeriod=30 Oct 03 09:05:50 crc kubenswrapper[4765]: I1003 09:05:50.609473 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" containerName="watcher-api" containerID="cri-o://c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d" gracePeriod=30 Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.302808 4765 generic.go:334] "Generic (PLEG): container finished" podID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" containerID="3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099" exitCode=143 Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.302859 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9","Type":"ContainerDied","Data":"3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099"} Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.731950 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.782848 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-cert-memcached-mtls\") pod \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.782940 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-combined-ca-bundle\") pod \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.783638 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-logs\") pod \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.783706 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-custom-prometheus-ca\") pod \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.783804 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvrsd\" (UniqueName: \"kubernetes.io/projected/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-kube-api-access-mvrsd\") pod \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.783831 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-config-data\") pod \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\" (UID: \"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9\") " Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.783869 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-logs" (OuterVolumeSpecName: "logs") pod "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" (UID: "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.784298 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.813942 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-kube-api-access-mvrsd" (OuterVolumeSpecName: "kube-api-access-mvrsd") pod "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" (UID: "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9"). InnerVolumeSpecName "kube-api-access-mvrsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.825960 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" (UID: "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.849723 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk"] Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.870342 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" (UID: "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.888997 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.889035 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.889048 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvrsd\" (UniqueName: \"kubernetes.io/projected/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-kube-api-access-mvrsd\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.896536 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-config-data" (OuterVolumeSpecName: "config-data") pod "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" (UID: "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.896626 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lh9fk"] Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935155 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher201a-account-delete-fq5l7"] Oct 03 09:05:51 crc kubenswrapper[4765]: E1003 09:05:51.935582 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" containerName="watcher-api" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935602 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" containerName="watcher-api" Oct 03 09:05:51 crc kubenswrapper[4765]: E1003 09:05:51.935616 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" containerName="watcher-kuttl-api-log" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935622 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" containerName="watcher-kuttl-api-log" Oct 03 09:05:51 crc kubenswrapper[4765]: E1003 09:05:51.935663 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" containerName="watcher-api" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935669 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" containerName="watcher-api" Oct 03 09:05:51 crc kubenswrapper[4765]: E1003 09:05:51.935680 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dee5256-0d10-4009-9774-c57edd4aa274" containerName="watcher-kuttl-api-log" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935685 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dee5256-0d10-4009-9774-c57edd4aa274" containerName="watcher-kuttl-api-log" Oct 03 09:05:51 crc kubenswrapper[4765]: E1003 09:05:51.935697 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" containerName="watcher-kuttl-api-log" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935704 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" containerName="watcher-kuttl-api-log" Oct 03 09:05:51 crc kubenswrapper[4765]: E1003 09:05:51.935716 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dee5256-0d10-4009-9774-c57edd4aa274" containerName="watcher-api" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935722 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dee5256-0d10-4009-9774-c57edd4aa274" containerName="watcher-api" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935904 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dee5256-0d10-4009-9774-c57edd4aa274" containerName="watcher-api" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935917 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" containerName="watcher-kuttl-api-log" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935953 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" containerName="watcher-kuttl-api-log" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935965 4765 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1dee5256-0d10-4009-9774-c57edd4aa274" containerName="watcher-kuttl-api-log" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935972 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" containerName="watcher-api" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.935986 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a8c09a-1859-47cd-b598-a9ffd6ce62b4" containerName="watcher-api" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.936590 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher201a-account-delete-fq5l7" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.945110 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher201a-account-delete-fq5l7"] Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.964003 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" (UID: "1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.991622 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mflkw\" (UniqueName: \"kubernetes.io/projected/d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4-kube-api-access-mflkw\") pod \"watcher201a-account-delete-fq5l7\" (UID: \"d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4\") " pod="watcher-kuttl-default/watcher201a-account-delete-fq5l7" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.991934 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:51 crc kubenswrapper[4765]: I1003 09:05:51.991968 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.003319 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.003630 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="bba0b645-70f1-4933-8370-f24077971b0c" containerName="watcher-applier" containerID="cri-o://7bc063ab757caea4438ed944cc746e782c86db03ec7ef54219691ba86dc3f431" gracePeriod=30 Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.033169 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.033431 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="3669b09d-23ed-4f85-b730-22c36851ca02" containerName="watcher-decision-engine" containerID="cri-o://61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9" gracePeriod=30 Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.093661 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mflkw\" (UniqueName: \"kubernetes.io/projected/d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4-kube-api-access-mflkw\") pod \"watcher201a-account-delete-fq5l7\" (UID: \"d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4\") " pod="watcher-kuttl-default/watcher201a-account-delete-fq5l7" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.115775 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mflkw\" (UniqueName: \"kubernetes.io/projected/d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4-kube-api-access-mflkw\") pod \"watcher201a-account-delete-fq5l7\" (UID: \"d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4\") " pod="watcher-kuttl-default/watcher201a-account-delete-fq5l7" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.273440 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher201a-account-delete-fq5l7" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.321915 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2" path="/var/lib/kubelet/pods/1cad9dd0-c4e9-442c-a3f0-c2d3ec0aaee2/volumes" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.322460 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dee5256-0d10-4009-9774-c57edd4aa274" path="/var/lib/kubelet/pods/1dee5256-0d10-4009-9774-c57edd4aa274/volumes" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.329044 4765 generic.go:334] "Generic (PLEG): container finished" podID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" containerID="c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d" exitCode=0 Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.329085 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9","Type":"ContainerDied","Data":"c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d"} Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.329114 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9","Type":"ContainerDied","Data":"e8fb7bea615170fc165565ff5e182140c36cca8f15b68d12084f416142f9b574"} Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.329134 4765 scope.go:117] "RemoveContainer" containerID="c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.329254 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.368286 4765 scope.go:117] "RemoveContainer" containerID="3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.373950 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.384103 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.415337 4765 scope.go:117] "RemoveContainer" containerID="c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d" Oct 03 09:05:52 crc kubenswrapper[4765]: E1003 09:05:52.418832 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d\": container with ID starting with c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d not found: ID does not exist" containerID="c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.418878 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d"} err="failed to get container status \"c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d\": rpc error: code = NotFound desc = could not find container \"c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d\": container with ID starting with c2813cfe40bc57c88a7569e33cd00057158f17e38460816e65400415ed19cf3d not found: ID does not exist" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.418904 4765 scope.go:117] "RemoveContainer" containerID="3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099" Oct 03 09:05:52 crc kubenswrapper[4765]: E1003 09:05:52.419416 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099\": container with ID starting with 3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099 not found: ID does not exist" containerID="3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.419468 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099"} err="failed to get container status \"3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099\": rpc error: code = NotFound desc = could not find container \"3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099\": container with ID starting with 3e12391e4a3be2cafea78b9fca57beafa77a4e56a66ea9a622496c43a5c16099 not found: ID does not exist" Oct 03 09:05:52 crc kubenswrapper[4765]: I1003 09:05:52.755914 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher201a-account-delete-fq5l7"] Oct 03 09:05:52 crc kubenswrapper[4765]: E1003 09:05:52.982493 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="7bc063ab757caea4438ed944cc746e782c86db03ec7ef54219691ba86dc3f431" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:05:52 crc kubenswrapper[4765]: E1003 09:05:52.985928 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7bc063ab757caea4438ed944cc746e782c86db03ec7ef54219691ba86dc3f431" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:05:52 crc kubenswrapper[4765]: E1003 09:05:52.992142 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7bc063ab757caea4438ed944cc746e782c86db03ec7ef54219691ba86dc3f431" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:05:52 crc kubenswrapper[4765]: E1003 09:05:52.992235 4765 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="bba0b645-70f1-4933-8370-f24077971b0c" containerName="watcher-applier" Oct 03 09:05:53 crc kubenswrapper[4765]: I1003 09:05:53.365703 4765 generic.go:334] "Generic (PLEG): container finished" podID="d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4" containerID="5c944ee405adf975cb9f464c89114bc85779a402ed3be0fec81254716566e0a8" exitCode=0 Oct 03 09:05:53 crc kubenswrapper[4765]: I1003 09:05:53.366562 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher201a-account-delete-fq5l7" event={"ID":"d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4","Type":"ContainerDied","Data":"5c944ee405adf975cb9f464c89114bc85779a402ed3be0fec81254716566e0a8"} Oct 03 09:05:53 crc kubenswrapper[4765]: I1003 09:05:53.366600 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher201a-account-delete-fq5l7" event={"ID":"d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4","Type":"ContainerStarted","Data":"176a9cee411f210f381ac1ee1273188456a07cd6637993972b4871448c8345aa"} Oct 03 09:05:54 crc kubenswrapper[4765]: I1003 09:05:54.325739 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9" path="/var/lib/kubelet/pods/1d8a89f7-c9ca-44bc-a600-6c2ff2d369f9/volumes" Oct 03 09:05:54 crc kubenswrapper[4765]: I1003 09:05:54.477478 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:54 crc kubenswrapper[4765]: I1003 09:05:54.478251 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="ceilometer-notification-agent" containerID="cri-o://b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c" gracePeriod=30 Oct 03 09:05:54 crc kubenswrapper[4765]: I1003 09:05:54.478252 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="sg-core" containerID="cri-o://32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799" gracePeriod=30 Oct 03 09:05:54 crc kubenswrapper[4765]: I1003 09:05:54.478458 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" 
podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="ceilometer-central-agent" containerID="cri-o://96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60" gracePeriod=30 Oct 03 09:05:54 crc kubenswrapper[4765]: I1003 09:05:54.478252 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="proxy-httpd" containerID="cri-o://4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442" gracePeriod=30 Oct 03 09:05:54 crc kubenswrapper[4765]: I1003 09:05:54.492066 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.186:3000/\": read tcp 10.217.0.2:41058->10.217.0.186:3000: read: connection reset by peer" Oct 03 09:05:54 crc kubenswrapper[4765]: I1003 09:05:54.806790 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher201a-account-delete-fq5l7" Oct 03 09:05:54 crc kubenswrapper[4765]: I1003 09:05:54.837242 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mflkw\" (UniqueName: \"kubernetes.io/projected/d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4-kube-api-access-mflkw\") pod \"d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4\" (UID: \"d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4\") " Oct 03 09:05:54 crc kubenswrapper[4765]: I1003 09:05:54.855514 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4-kube-api-access-mflkw" (OuterVolumeSpecName: "kube-api-access-mflkw") pod "d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4" (UID: "d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4"). InnerVolumeSpecName "kube-api-access-mflkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:54 crc kubenswrapper[4765]: I1003 09:05:54.938998 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mflkw\" (UniqueName: \"kubernetes.io/projected/d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4-kube-api-access-mflkw\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:55 crc kubenswrapper[4765]: I1003 09:05:55.394843 4765 generic.go:334] "Generic (PLEG): container finished" podID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerID="4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442" exitCode=0 Oct 03 09:05:55 crc kubenswrapper[4765]: I1003 09:05:55.394871 4765 generic.go:334] "Generic (PLEG): container finished" podID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerID="32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799" exitCode=2 Oct 03 09:05:55 crc kubenswrapper[4765]: I1003 09:05:55.394880 4765 generic.go:334] "Generic (PLEG): container finished" podID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerID="96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60" exitCode=0 Oct 03 09:05:55 crc kubenswrapper[4765]: I1003 09:05:55.394943 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6ffed54e-6e2d-463d-b21b-f9ce909b1264","Type":"ContainerDied","Data":"4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442"} Oct 03 09:05:55 crc kubenswrapper[4765]: I1003 09:05:55.394999 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6ffed54e-6e2d-463d-b21b-f9ce909b1264","Type":"ContainerDied","Data":"32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799"} Oct 03 09:05:55 crc kubenswrapper[4765]: I1003 09:05:55.395014 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6ffed54e-6e2d-463d-b21b-f9ce909b1264","Type":"ContainerDied","Data":"96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60"} Oct 03 09:05:55 crc kubenswrapper[4765]: I1003 09:05:55.396394 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher201a-account-delete-fq5l7" event={"ID":"d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4","Type":"ContainerDied","Data":"176a9cee411f210f381ac1ee1273188456a07cd6637993972b4871448c8345aa"} Oct 03 09:05:55 crc kubenswrapper[4765]: I1003 09:05:55.396478 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher201a-account-delete-fq5l7" Oct 03 09:05:55 crc kubenswrapper[4765]: I1003 09:05:55.396496 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="176a9cee411f210f381ac1ee1273188456a07cd6637993972b4871448c8345aa" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.044832 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.054584 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-ceilometer-tls-certs\") pod \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.054887 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-config-data\") pod \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.055056 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-log-httpd\") pod \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.055190 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-combined-ca-bundle\") pod \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.055294 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qnjp\" (UniqueName: \"kubernetes.io/projected/6ffed54e-6e2d-463d-b21b-f9ce909b1264-kube-api-access-2qnjp\") pod \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.055429 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-scripts\") pod \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.055494 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6ffed54e-6e2d-463d-b21b-f9ce909b1264" (UID: "6ffed54e-6e2d-463d-b21b-f9ce909b1264"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.055499 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-sg-core-conf-yaml\") pod \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.055638 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-run-httpd\") pod \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\" (UID: \"6ffed54e-6e2d-463d-b21b-f9ce909b1264\") " Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.056018 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.056145 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6ffed54e-6e2d-463d-b21b-f9ce909b1264" (UID: "6ffed54e-6e2d-463d-b21b-f9ce909b1264"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.062239 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-scripts" (OuterVolumeSpecName: "scripts") pod "6ffed54e-6e2d-463d-b21b-f9ce909b1264" (UID: "6ffed54e-6e2d-463d-b21b-f9ce909b1264"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.067981 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ffed54e-6e2d-463d-b21b-f9ce909b1264-kube-api-access-2qnjp" (OuterVolumeSpecName: "kube-api-access-2qnjp") pod "6ffed54e-6e2d-463d-b21b-f9ce909b1264" (UID: "6ffed54e-6e2d-463d-b21b-f9ce909b1264"). InnerVolumeSpecName "kube-api-access-2qnjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.098109 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6ffed54e-6e2d-463d-b21b-f9ce909b1264" (UID: "6ffed54e-6e2d-463d-b21b-f9ce909b1264"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.108875 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6ffed54e-6e2d-463d-b21b-f9ce909b1264" (UID: "6ffed54e-6e2d-463d-b21b-f9ce909b1264"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.134969 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ffed54e-6e2d-463d-b21b-f9ce909b1264" (UID: "6ffed54e-6e2d-463d-b21b-f9ce909b1264"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.153476 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-config-data" (OuterVolumeSpecName: "config-data") pod "6ffed54e-6e2d-463d-b21b-f9ce909b1264" (UID: "6ffed54e-6e2d-463d-b21b-f9ce909b1264"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.157544 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.157574 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qnjp\" (UniqueName: \"kubernetes.io/projected/6ffed54e-6e2d-463d-b21b-f9ce909b1264-kube-api-access-2qnjp\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.157584 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.157593 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.157602 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ffed54e-6e2d-463d-b21b-f9ce909b1264-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.157610 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.157617 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ffed54e-6e2d-463d-b21b-f9ce909b1264-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.406431 4765 generic.go:334] "Generic (PLEG): container finished" podID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerID="b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c" exitCode=0 Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.406495 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.406521 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6ffed54e-6e2d-463d-b21b-f9ce909b1264","Type":"ContainerDied","Data":"b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c"} Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.407448 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"6ffed54e-6e2d-463d-b21b-f9ce909b1264","Type":"ContainerDied","Data":"f48e8215dd5a95254c08479d945518e6ee3d3418b6b3a656df9b6d258fbe4fb1"} Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.407470 4765 scope.go:117] "RemoveContainer" containerID="4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.434131 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.435786 4765 scope.go:117] "RemoveContainer" containerID="32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.441334 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.458026 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:56 crc kubenswrapper[4765]: E1003 09:05:56.458366 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4" containerName="mariadb-account-delete" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.458384 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4" containerName="mariadb-account-delete" Oct 03 09:05:56 crc kubenswrapper[4765]: E1003 09:05:56.458397 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="ceilometer-notification-agent" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.458405 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="ceilometer-notification-agent" Oct 03 09:05:56 crc kubenswrapper[4765]: E1003 09:05:56.458417 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="sg-core" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.458425 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="sg-core" Oct 03 09:05:56 crc kubenswrapper[4765]: E1003 09:05:56.458440 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="ceilometer-central-agent" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.458447 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="ceilometer-central-agent" Oct 03 09:05:56 crc kubenswrapper[4765]: E1003 09:05:56.458461 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="proxy-httpd" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.458467 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="proxy-httpd" Oct 03 09:05:56 
crc kubenswrapper[4765]: I1003 09:05:56.458680 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="ceilometer-central-agent" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.458696 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="sg-core" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.458705 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4" containerName="mariadb-account-delete" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.458720 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="proxy-httpd" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.458747 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" containerName="ceilometer-notification-agent" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.460343 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.467876 4765 scope.go:117] "RemoveContainer" containerID="b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.468254 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.468401 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.468578 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.472485 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.505099 4765 scope.go:117] "RemoveContainer" containerID="96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.525990 4765 scope.go:117] "RemoveContainer" containerID="4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442" Oct 03 09:05:56 crc kubenswrapper[4765]: E1003 09:05:56.526698 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442\": container with ID starting with 4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442 not found: ID does not exist" containerID="4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.526757 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442"} err="failed to get container status \"4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442\": rpc error: code = NotFound desc = could not find container \"4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442\": container with ID starting with 4d3defc069f26cffaef7bd8bb410aeb1dc7e9e0ce6f2cc712b08b2023a284442 not found: ID does not exist" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.526785 
4765 scope.go:117] "RemoveContainer" containerID="32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799" Oct 03 09:05:56 crc kubenswrapper[4765]: E1003 09:05:56.527185 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799\": container with ID starting with 32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799 not found: ID does not exist" containerID="32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.527213 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799"} err="failed to get container status \"32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799\": rpc error: code = NotFound desc = could not find container \"32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799\": container with ID starting with 32b48866ca8b41c0fa2032a04197049e43ffdeccc6934c55ed37377073e72799 not found: ID does not exist" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.527231 4765 scope.go:117] "RemoveContainer" containerID="b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c" Oct 03 09:05:56 crc kubenswrapper[4765]: E1003 09:05:56.528325 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c\": container with ID starting with b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c not found: ID does not exist" containerID="b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.528378 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c"} err="failed to get container status \"b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c\": rpc error: code = NotFound desc = could not find container \"b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c\": container with ID starting with b1ab83ca5db2f8eb4f2ae72d3b3c46bb8132e82978d8f4d0ccd7b1ca3a8f810c not found: ID does not exist" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.528399 4765 scope.go:117] "RemoveContainer" containerID="96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60" Oct 03 09:05:56 crc kubenswrapper[4765]: E1003 09:05:56.528908 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60\": container with ID starting with 96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60 not found: ID does not exist" containerID="96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.529033 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60"} err="failed to get container status \"96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60\": rpc error: code = NotFound desc = could not find container \"96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60\": container with ID starting with 
96c2d8150247b33574bafdd3753851fdfb7a61afe1d584a46cef3bc486461c60 not found: ID does not exist" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.565422 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm65c\" (UniqueName: \"kubernetes.io/projected/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-kube-api-access-gm65c\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.565483 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.565505 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-run-httpd\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.565570 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.565593 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-config-data\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.565615 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-scripts\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.565681 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.565714 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-log-httpd\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.668776 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " 
pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.669048 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-log-httpd\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.669168 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm65c\" (UniqueName: \"kubernetes.io/projected/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-kube-api-access-gm65c\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.669276 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.669442 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-run-httpd\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.669630 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.669878 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-run-httpd\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.669954 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-log-httpd\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.670984 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-config-data\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.672662 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-scripts\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.673422 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.678297 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.678700 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.679333 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-config-data\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.693252 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-scripts\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.694313 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm65c\" (UniqueName: \"kubernetes.io/projected/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-kube-api-access-gm65c\") pod \"ceilometer-0\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.794509 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.877948 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-b4j4f"] Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.895119 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-b4j4f"] Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.942972 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-201a-account-create-xwrlp"] Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.975705 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-201a-account-create-xwrlp"] Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.986552 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher201a-account-delete-fq5l7"] Oct 03 09:05:56 crc kubenswrapper[4765]: I1003 09:05:56.996737 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher201a-account-delete-fq5l7"] Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.325481 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:05:57 crc kubenswrapper[4765]: W1003 09:05:57.328469 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62a7a58b_c181_4ef7_b62d_de2a16d2a47c.slice/crio-5777b6feccf0629666e398cc77016bb325c7ef30d05ff401552bee7ed071bc90 WatchSource:0}: Error finding container 5777b6feccf0629666e398cc77016bb325c7ef30d05ff401552bee7ed071bc90: Status 404 returned error can't find the container with id 5777b6feccf0629666e398cc77016bb325c7ef30d05ff401552bee7ed071bc90 Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.419662 4765 generic.go:334] "Generic (PLEG): container finished" podID="bba0b645-70f1-4933-8370-f24077971b0c" containerID="7bc063ab757caea4438ed944cc746e782c86db03ec7ef54219691ba86dc3f431" exitCode=0 Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.419737 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"bba0b645-70f1-4933-8370-f24077971b0c","Type":"ContainerDied","Data":"7bc063ab757caea4438ed944cc746e782c86db03ec7ef54219691ba86dc3f431"} Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.426010 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"62a7a58b-c181-4ef7-b62d-de2a16d2a47c","Type":"ContainerStarted","Data":"5777b6feccf0629666e398cc77016bb325c7ef30d05ff401552bee7ed071bc90"} Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.571143 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.609492 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-combined-ca-bundle\") pod \"bba0b645-70f1-4933-8370-f24077971b0c\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.609599 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkgjr\" (UniqueName: \"kubernetes.io/projected/bba0b645-70f1-4933-8370-f24077971b0c-kube-api-access-gkgjr\") pod \"bba0b645-70f1-4933-8370-f24077971b0c\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.609717 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-cert-memcached-mtls\") pod \"bba0b645-70f1-4933-8370-f24077971b0c\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.609820 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bba0b645-70f1-4933-8370-f24077971b0c-logs\") pod \"bba0b645-70f1-4933-8370-f24077971b0c\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.609889 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-config-data\") pod \"bba0b645-70f1-4933-8370-f24077971b0c\" (UID: \"bba0b645-70f1-4933-8370-f24077971b0c\") " Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.611195 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bba0b645-70f1-4933-8370-f24077971b0c-logs" (OuterVolumeSpecName: "logs") pod "bba0b645-70f1-4933-8370-f24077971b0c" (UID: "bba0b645-70f1-4933-8370-f24077971b0c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.620895 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bba0b645-70f1-4933-8370-f24077971b0c-kube-api-access-gkgjr" (OuterVolumeSpecName: "kube-api-access-gkgjr") pod "bba0b645-70f1-4933-8370-f24077971b0c" (UID: "bba0b645-70f1-4933-8370-f24077971b0c"). InnerVolumeSpecName "kube-api-access-gkgjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.638064 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bba0b645-70f1-4933-8370-f24077971b0c" (UID: "bba0b645-70f1-4933-8370-f24077971b0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.670874 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-config-data" (OuterVolumeSpecName: "config-data") pod "bba0b645-70f1-4933-8370-f24077971b0c" (UID: "bba0b645-70f1-4933-8370-f24077971b0c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.682959 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "bba0b645-70f1-4933-8370-f24077971b0c" (UID: "bba0b645-70f1-4933-8370-f24077971b0c"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.711944 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bba0b645-70f1-4933-8370-f24077971b0c-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.711978 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.711991 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.712002 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkgjr\" (UniqueName: \"kubernetes.io/projected/bba0b645-70f1-4933-8370-f24077971b0c-kube-api-access-gkgjr\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:57 crc kubenswrapper[4765]: I1003 09:05:57.712011 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/bba0b645-70f1-4933-8370-f24077971b0c-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.116756 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.221761 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-cert-memcached-mtls\") pod \"3669b09d-23ed-4f85-b730-22c36851ca02\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.221831 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-config-data\") pod \"3669b09d-23ed-4f85-b730-22c36851ca02\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.221972 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3669b09d-23ed-4f85-b730-22c36851ca02-logs\") pod \"3669b09d-23ed-4f85-b730-22c36851ca02\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.222021 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfc2s\" (UniqueName: \"kubernetes.io/projected/3669b09d-23ed-4f85-b730-22c36851ca02-kube-api-access-qfc2s\") pod \"3669b09d-23ed-4f85-b730-22c36851ca02\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.222060 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-combined-ca-bundle\") pod \"3669b09d-23ed-4f85-b730-22c36851ca02\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.222136 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-custom-prometheus-ca\") pod \"3669b09d-23ed-4f85-b730-22c36851ca02\" (UID: \"3669b09d-23ed-4f85-b730-22c36851ca02\") " Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.223290 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3669b09d-23ed-4f85-b730-22c36851ca02-logs" (OuterVolumeSpecName: "logs") pod "3669b09d-23ed-4f85-b730-22c36851ca02" (UID: "3669b09d-23ed-4f85-b730-22c36851ca02"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.235733 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3669b09d-23ed-4f85-b730-22c36851ca02-kube-api-access-qfc2s" (OuterVolumeSpecName: "kube-api-access-qfc2s") pod "3669b09d-23ed-4f85-b730-22c36851ca02" (UID: "3669b09d-23ed-4f85-b730-22c36851ca02"). InnerVolumeSpecName "kube-api-access-qfc2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.248391 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "3669b09d-23ed-4f85-b730-22c36851ca02" (UID: "3669b09d-23ed-4f85-b730-22c36851ca02"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.255485 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3669b09d-23ed-4f85-b730-22c36851ca02" (UID: "3669b09d-23ed-4f85-b730-22c36851ca02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.274877 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-config-data" (OuterVolumeSpecName: "config-data") pod "3669b09d-23ed-4f85-b730-22c36851ca02" (UID: "3669b09d-23ed-4f85-b730-22c36851ca02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.293075 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "3669b09d-23ed-4f85-b730-22c36851ca02" (UID: "3669b09d-23ed-4f85-b730-22c36851ca02"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.307185 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:05:58 crc kubenswrapper[4765]: E1003 09:05:58.307637 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.319410 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="520ae465-8518-4135-8906-29b80e6e0543" path="/var/lib/kubelet/pods/520ae465-8518-4135-8906-29b80e6e0543/volumes" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.320210 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ffed54e-6e2d-463d-b21b-f9ce909b1264" path="/var/lib/kubelet/pods/6ffed54e-6e2d-463d-b21b-f9ce909b1264/volumes" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.321063 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8677a574-cd76-43b4-9a66-fcd43b04112e" path="/var/lib/kubelet/pods/8677a574-cd76-43b4-9a66-fcd43b04112e/volumes" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.324072 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3669b09d-23ed-4f85-b730-22c36851ca02-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.324106 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfc2s\" (UniqueName: \"kubernetes.io/projected/3669b09d-23ed-4f85-b730-22c36851ca02-kube-api-access-qfc2s\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.324118 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.324131 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.324145 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.324514 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3669b09d-23ed-4f85-b730-22c36851ca02-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.324704 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4" path="/var/lib/kubelet/pods/d4cfdf95-c2d0-44b9-b58f-0d93887dd6b4/volumes" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.435931 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"bba0b645-70f1-4933-8370-f24077971b0c","Type":"ContainerDied","Data":"00de83f377edc168b6df4adc489d9de51299b7fc88ddb70b0562c8f10344349e"} Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.435981 4765 scope.go:117] "RemoveContainer" containerID="7bc063ab757caea4438ed944cc746e782c86db03ec7ef54219691ba86dc3f431" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.435976 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.438983 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"62a7a58b-c181-4ef7-b62d-de2a16d2a47c","Type":"ContainerStarted","Data":"bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d"} Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.441849 4765 generic.go:334] "Generic (PLEG): container finished" podID="3669b09d-23ed-4f85-b730-22c36851ca02" containerID="61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9" exitCode=0 Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.441903 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"3669b09d-23ed-4f85-b730-22c36851ca02","Type":"ContainerDied","Data":"61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9"} Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.441936 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"3669b09d-23ed-4f85-b730-22c36851ca02","Type":"ContainerDied","Data":"8654bd470422c1029ec068530a117633cbf989aca06415480edbcbf7b10e3cc2"} Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.441904 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.457889 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.461195 4765 scope.go:117] "RemoveContainer" containerID="61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.467356 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.482219 4765 scope.go:117] "RemoveContainer" containerID="61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9" Oct 03 09:05:58 crc kubenswrapper[4765]: E1003 09:05:58.483275 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9\": container with ID starting with 61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9 not found: ID does not exist" containerID="61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.483302 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9"} err="failed to get container status \"61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9\": rpc error: code = NotFound desc = could not find container \"61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9\": container with ID starting with 61ec2535a698fc94c3eeb1f6ace68e948318b07eba61b15e3e11e6045959bde9 not found: ID does not exist" Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.485323 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:05:58 crc kubenswrapper[4765]: I1003 09:05:58.491961 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.161748 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-wfptv"] Oct 03 09:05:59 crc kubenswrapper[4765]: E1003 09:05:59.162504 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3669b09d-23ed-4f85-b730-22c36851ca02" containerName="watcher-decision-engine" Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.162527 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3669b09d-23ed-4f85-b730-22c36851ca02" containerName="watcher-decision-engine" Oct 03 09:05:59 crc kubenswrapper[4765]: E1003 09:05:59.162541 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba0b645-70f1-4933-8370-f24077971b0c" containerName="watcher-applier" Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.162549 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba0b645-70f1-4933-8370-f24077971b0c" containerName="watcher-applier" Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.162728 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="bba0b645-70f1-4933-8370-f24077971b0c" containerName="watcher-applier" Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.162755 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3669b09d-23ed-4f85-b730-22c36851ca02" 
containerName="watcher-decision-engine" Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.163369 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-wfptv" Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.175124 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-wfptv"] Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.240565 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjd5\" (UniqueName: \"kubernetes.io/projected/42c7eba2-2b60-4283-bdfe-320338b7c04d-kube-api-access-tsjd5\") pod \"watcher-db-create-wfptv\" (UID: \"42c7eba2-2b60-4283-bdfe-320338b7c04d\") " pod="watcher-kuttl-default/watcher-db-create-wfptv" Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.341681 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsjd5\" (UniqueName: \"kubernetes.io/projected/42c7eba2-2b60-4283-bdfe-320338b7c04d-kube-api-access-tsjd5\") pod \"watcher-db-create-wfptv\" (UID: \"42c7eba2-2b60-4283-bdfe-320338b7c04d\") " pod="watcher-kuttl-default/watcher-db-create-wfptv" Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.361440 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsjd5\" (UniqueName: \"kubernetes.io/projected/42c7eba2-2b60-4283-bdfe-320338b7c04d-kube-api-access-tsjd5\") pod \"watcher-db-create-wfptv\" (UID: \"42c7eba2-2b60-4283-bdfe-320338b7c04d\") " pod="watcher-kuttl-default/watcher-db-create-wfptv" Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.455100 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"62a7a58b-c181-4ef7-b62d-de2a16d2a47c","Type":"ContainerStarted","Data":"50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49"} Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.479857 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-wfptv" Oct 03 09:05:59 crc kubenswrapper[4765]: I1003 09:05:59.958466 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-wfptv"] Oct 03 09:06:00 crc kubenswrapper[4765]: I1003 09:06:00.319277 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3669b09d-23ed-4f85-b730-22c36851ca02" path="/var/lib/kubelet/pods/3669b09d-23ed-4f85-b730-22c36851ca02/volumes" Oct 03 09:06:00 crc kubenswrapper[4765]: I1003 09:06:00.320525 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bba0b645-70f1-4933-8370-f24077971b0c" path="/var/lib/kubelet/pods/bba0b645-70f1-4933-8370-f24077971b0c/volumes" Oct 03 09:06:00 crc kubenswrapper[4765]: I1003 09:06:00.465956 4765 generic.go:334] "Generic (PLEG): container finished" podID="42c7eba2-2b60-4283-bdfe-320338b7c04d" containerID="7f580f6c80011a6013386fcfee558fccb2c9b64b5f6460aa6fde8191d9650009" exitCode=0 Oct 03 09:06:00 crc kubenswrapper[4765]: I1003 09:06:00.466051 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-wfptv" event={"ID":"42c7eba2-2b60-4283-bdfe-320338b7c04d","Type":"ContainerDied","Data":"7f580f6c80011a6013386fcfee558fccb2c9b64b5f6460aa6fde8191d9650009"} Oct 03 09:06:00 crc kubenswrapper[4765]: I1003 09:06:00.466082 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-wfptv" event={"ID":"42c7eba2-2b60-4283-bdfe-320338b7c04d","Type":"ContainerStarted","Data":"0cf9e830df8cfd631dddf05130c1c1b5ae8f1a929772f2d757375964d835d867"} Oct 03 09:06:00 crc kubenswrapper[4765]: I1003 09:06:00.468125 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"62a7a58b-c181-4ef7-b62d-de2a16d2a47c","Type":"ContainerStarted","Data":"4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b"} Oct 03 09:06:01 crc kubenswrapper[4765]: I1003 09:06:01.947807 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-wfptv" Oct 03 09:06:01 crc kubenswrapper[4765]: I1003 09:06:01.983974 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsjd5\" (UniqueName: \"kubernetes.io/projected/42c7eba2-2b60-4283-bdfe-320338b7c04d-kube-api-access-tsjd5\") pod \"42c7eba2-2b60-4283-bdfe-320338b7c04d\" (UID: \"42c7eba2-2b60-4283-bdfe-320338b7c04d\") " Oct 03 09:06:01 crc kubenswrapper[4765]: I1003 09:06:01.998863 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c7eba2-2b60-4283-bdfe-320338b7c04d-kube-api-access-tsjd5" (OuterVolumeSpecName: "kube-api-access-tsjd5") pod "42c7eba2-2b60-4283-bdfe-320338b7c04d" (UID: "42c7eba2-2b60-4283-bdfe-320338b7c04d"). InnerVolumeSpecName "kube-api-access-tsjd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:02 crc kubenswrapper[4765]: I1003 09:06:02.085860 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsjd5\" (UniqueName: \"kubernetes.io/projected/42c7eba2-2b60-4283-bdfe-320338b7c04d-kube-api-access-tsjd5\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:02 crc kubenswrapper[4765]: E1003 09:06:02.452302 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42c7eba2_2b60_4283_bdfe_320338b7c04d.slice\": RecentStats: unable to find data in memory cache]" Oct 03 09:06:02 crc kubenswrapper[4765]: I1003 09:06:02.496194 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-wfptv" event={"ID":"42c7eba2-2b60-4283-bdfe-320338b7c04d","Type":"ContainerDied","Data":"0cf9e830df8cfd631dddf05130c1c1b5ae8f1a929772f2d757375964d835d867"} Oct 03 09:06:02 crc kubenswrapper[4765]: I1003 09:06:02.496245 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cf9e830df8cfd631dddf05130c1c1b5ae8f1a929772f2d757375964d835d867" Oct 03 09:06:02 crc kubenswrapper[4765]: I1003 09:06:02.496306 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-wfptv" Oct 03 09:06:02 crc kubenswrapper[4765]: I1003 09:06:02.499118 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"62a7a58b-c181-4ef7-b62d-de2a16d2a47c","Type":"ContainerStarted","Data":"a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828"} Oct 03 09:06:02 crc kubenswrapper[4765]: I1003 09:06:02.499296 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:02 crc kubenswrapper[4765]: I1003 09:06:02.524247 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.561403306 podStartE2EDuration="6.524226988s" podCreationTimestamp="2025-10-03 09:05:56 +0000 UTC" firstStartedPulling="2025-10-03 09:05:57.330514676 +0000 UTC m=+1601.632009006" lastFinishedPulling="2025-10-03 09:06:01.293338358 +0000 UTC m=+1605.594832688" observedRunningTime="2025-10-03 09:06:02.517024323 +0000 UTC m=+1606.818518653" watchObservedRunningTime="2025-10-03 09:06:02.524226988 +0000 UTC m=+1606.825721318" Oct 03 09:06:09 crc kubenswrapper[4765]: I1003 09:06:09.171895 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-bc0e-account-create-qh4zl"] Oct 03 09:06:09 crc kubenswrapper[4765]: E1003 09:06:09.172844 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c7eba2-2b60-4283-bdfe-320338b7c04d" containerName="mariadb-database-create" Oct 03 09:06:09 crc kubenswrapper[4765]: I1003 09:06:09.172860 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c7eba2-2b60-4283-bdfe-320338b7c04d" containerName="mariadb-database-create" Oct 03 09:06:09 crc kubenswrapper[4765]: I1003 09:06:09.173077 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c7eba2-2b60-4283-bdfe-320338b7c04d" containerName="mariadb-database-create" Oct 03 09:06:09 crc kubenswrapper[4765]: I1003 09:06:09.173763 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" Oct 03 09:06:09 crc kubenswrapper[4765]: I1003 09:06:09.176381 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Oct 03 09:06:09 crc kubenswrapper[4765]: I1003 09:06:09.181308 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-bc0e-account-create-qh4zl"] Oct 03 09:06:09 crc kubenswrapper[4765]: I1003 09:06:09.298467 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwxk4\" (UniqueName: \"kubernetes.io/projected/870c1bf5-02a9-472e-8747-b92bdc505367-kube-api-access-hwxk4\") pod \"watcher-bc0e-account-create-qh4zl\" (UID: \"870c1bf5-02a9-472e-8747-b92bdc505367\") " pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" Oct 03 09:06:09 crc kubenswrapper[4765]: I1003 09:06:09.400481 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwxk4\" (UniqueName: \"kubernetes.io/projected/870c1bf5-02a9-472e-8747-b92bdc505367-kube-api-access-hwxk4\") pod \"watcher-bc0e-account-create-qh4zl\" (UID: \"870c1bf5-02a9-472e-8747-b92bdc505367\") " pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" Oct 03 09:06:09 crc kubenswrapper[4765]: I1003 09:06:09.421718 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwxk4\" (UniqueName: \"kubernetes.io/projected/870c1bf5-02a9-472e-8747-b92bdc505367-kube-api-access-hwxk4\") pod \"watcher-bc0e-account-create-qh4zl\" (UID: \"870c1bf5-02a9-472e-8747-b92bdc505367\") " pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" Oct 03 09:06:09 crc kubenswrapper[4765]: I1003 09:06:09.498056 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" Oct 03 09:06:10 crc kubenswrapper[4765]: I1003 09:06:10.007022 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-bc0e-account-create-qh4zl"] Oct 03 09:06:10 crc kubenswrapper[4765]: I1003 09:06:10.588816 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" event={"ID":"870c1bf5-02a9-472e-8747-b92bdc505367","Type":"ContainerStarted","Data":"917858b779ed740f3db191723520a9ae27e54bcc078a6aebe580ef8a1baf447b"} Oct 03 09:06:10 crc kubenswrapper[4765]: I1003 09:06:10.589169 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" event={"ID":"870c1bf5-02a9-472e-8747-b92bdc505367","Type":"ContainerStarted","Data":"611aba021a5886783412f009c56ba993a9fa5b7bb664cbe312224d8c66ef71f5"} Oct 03 09:06:10 crc kubenswrapper[4765]: I1003 09:06:10.611859 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" podStartSLOduration=1.6118349699999999 podStartE2EDuration="1.61183497s" podCreationTimestamp="2025-10-03 09:06:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:06:10.606277517 +0000 UTC m=+1614.907771857" watchObservedRunningTime="2025-10-03 09:06:10.61183497 +0000 UTC m=+1614.913329300" Oct 03 09:06:11 crc kubenswrapper[4765]: I1003 09:06:11.598493 4765 generic.go:334] "Generic (PLEG): container finished" podID="870c1bf5-02a9-472e-8747-b92bdc505367" containerID="917858b779ed740f3db191723520a9ae27e54bcc078a6aebe580ef8a1baf447b" exitCode=0 Oct 03 09:06:11 crc kubenswrapper[4765]: I1003 09:06:11.598565 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" event={"ID":"870c1bf5-02a9-472e-8747-b92bdc505367","Type":"ContainerDied","Data":"917858b779ed740f3db191723520a9ae27e54bcc078a6aebe580ef8a1baf447b"} Oct 03 09:06:13 crc kubenswrapper[4765]: I1003 09:06:13.011752 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" Oct 03 09:06:13 crc kubenswrapper[4765]: I1003 09:06:13.163457 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwxk4\" (UniqueName: \"kubernetes.io/projected/870c1bf5-02a9-472e-8747-b92bdc505367-kube-api-access-hwxk4\") pod \"870c1bf5-02a9-472e-8747-b92bdc505367\" (UID: \"870c1bf5-02a9-472e-8747-b92bdc505367\") " Oct 03 09:06:13 crc kubenswrapper[4765]: I1003 09:06:13.173231 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870c1bf5-02a9-472e-8747-b92bdc505367-kube-api-access-hwxk4" (OuterVolumeSpecName: "kube-api-access-hwxk4") pod "870c1bf5-02a9-472e-8747-b92bdc505367" (UID: "870c1bf5-02a9-472e-8747-b92bdc505367"). InnerVolumeSpecName "kube-api-access-hwxk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:13 crc kubenswrapper[4765]: I1003 09:06:13.266309 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwxk4\" (UniqueName: \"kubernetes.io/projected/870c1bf5-02a9-472e-8747-b92bdc505367-kube-api-access-hwxk4\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:13 crc kubenswrapper[4765]: I1003 09:06:13.307101 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:06:13 crc kubenswrapper[4765]: E1003 09:06:13.307476 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:06:13 crc kubenswrapper[4765]: I1003 09:06:13.629979 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" event={"ID":"870c1bf5-02a9-472e-8747-b92bdc505367","Type":"ContainerDied","Data":"611aba021a5886783412f009c56ba993a9fa5b7bb664cbe312224d8c66ef71f5"} Oct 03 09:06:13 crc kubenswrapper[4765]: I1003 09:06:13.630035 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="611aba021a5886783412f009c56ba993a9fa5b7bb664cbe312224d8c66ef71f5" Oct 03 09:06:13 crc kubenswrapper[4765]: I1003 09:06:13.630095 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-bc0e-account-create-qh4zl" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.512136 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv"] Oct 03 09:06:14 crc kubenswrapper[4765]: E1003 09:06:14.512486 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870c1bf5-02a9-472e-8747-b92bdc505367" containerName="mariadb-account-create" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.512499 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="870c1bf5-02a9-472e-8747-b92bdc505367" containerName="mariadb-account-create" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.512680 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="870c1bf5-02a9-472e-8747-b92bdc505367" containerName="mariadb-account-create" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.513364 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.516578 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.516719 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-f4q6z" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.526120 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv"] Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.691134 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-config-data\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.691456 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.691655 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl87c\" (UniqueName: \"kubernetes.io/projected/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-kube-api-access-zl87c\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.691827 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.793632 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-config-data\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.793842 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.793914 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl87c\" (UniqueName: \"kubernetes.io/projected/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-kube-api-access-zl87c\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc 
kubenswrapper[4765]: I1003 09:06:14.793974 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.799315 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-db-sync-config-data\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.805474 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.806470 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-config-data\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.810618 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl87c\" (UniqueName: \"kubernetes.io/projected/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-kube-api-access-zl87c\") pod \"watcher-kuttl-db-sync-lzqdv\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:14 crc kubenswrapper[4765]: I1003 09:06:14.837798 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:15 crc kubenswrapper[4765]: I1003 09:06:15.283955 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv"] Oct 03 09:06:15 crc kubenswrapper[4765]: I1003 09:06:15.647380 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" event={"ID":"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1","Type":"ContainerStarted","Data":"28c8d5a9fb8b49bec37ee98f35896a8cb55afcf28d1ba92c93967268c0f07240"} Oct 03 09:06:15 crc kubenswrapper[4765]: I1003 09:06:15.647434 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" event={"ID":"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1","Type":"ContainerStarted","Data":"849bb1ac7359c89597582304ecbadbfdbb54640b269b14eef0c83e0ff7f340df"} Oct 03 09:06:15 crc kubenswrapper[4765]: I1003 09:06:15.670271 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" podStartSLOduration=1.67025214 podStartE2EDuration="1.67025214s" podCreationTimestamp="2025-10-03 09:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:06:15.662587993 +0000 UTC m=+1619.964082343" watchObservedRunningTime="2025-10-03 09:06:15.67025214 +0000 UTC m=+1619.971746470" Oct 03 09:06:17 crc kubenswrapper[4765]: I1003 09:06:17.909067 4765 scope.go:117] "RemoveContainer" containerID="d30cd7dcf5b79b28884651b6af6741815ff55b619c7d6fb4417ac39d2831b88d" Oct 03 09:06:17 crc kubenswrapper[4765]: I1003 09:06:17.946584 4765 scope.go:117] "RemoveContainer" containerID="c0ee5a1efc2893c2e7c90c4b084b0786c801d29e68edff371ca727b1dfd86db0" Oct 03 09:06:18 crc kubenswrapper[4765]: I1003 09:06:18.675887 4765 generic.go:334] "Generic (PLEG): container finished" podID="bb0e5f30-95a2-4a98-80c2-cd33dd4328a1" containerID="28c8d5a9fb8b49bec37ee98f35896a8cb55afcf28d1ba92c93967268c0f07240" exitCode=0 Oct 03 09:06:18 crc kubenswrapper[4765]: I1003 09:06:18.676008 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" event={"ID":"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1","Type":"ContainerDied","Data":"28c8d5a9fb8b49bec37ee98f35896a8cb55afcf28d1ba92c93967268c0f07240"} Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.060663 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.178097 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl87c\" (UniqueName: \"kubernetes.io/projected/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-kube-api-access-zl87c\") pod \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.178177 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-config-data\") pod \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.178206 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-db-sync-config-data\") pod \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.178287 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-combined-ca-bundle\") pod \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\" (UID: \"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1\") " Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.183246 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bb0e5f30-95a2-4a98-80c2-cd33dd4328a1" (UID: "bb0e5f30-95a2-4a98-80c2-cd33dd4328a1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.183509 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-kube-api-access-zl87c" (OuterVolumeSpecName: "kube-api-access-zl87c") pod "bb0e5f30-95a2-4a98-80c2-cd33dd4328a1" (UID: "bb0e5f30-95a2-4a98-80c2-cd33dd4328a1"). InnerVolumeSpecName "kube-api-access-zl87c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.203557 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb0e5f30-95a2-4a98-80c2-cd33dd4328a1" (UID: "bb0e5f30-95a2-4a98-80c2-cd33dd4328a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.233165 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-config-data" (OuterVolumeSpecName: "config-data") pod "bb0e5f30-95a2-4a98-80c2-cd33dd4328a1" (UID: "bb0e5f30-95a2-4a98-80c2-cd33dd4328a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.280693 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl87c\" (UniqueName: \"kubernetes.io/projected/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-kube-api-access-zl87c\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.280751 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.280765 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.280776 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.694463 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" event={"ID":"bb0e5f30-95a2-4a98-80c2-cd33dd4328a1","Type":"ContainerDied","Data":"849bb1ac7359c89597582304ecbadbfdbb54640b269b14eef0c83e0ff7f340df"} Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.694532 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.694544 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="849bb1ac7359c89597582304ecbadbfdbb54640b269b14eef0c83e0ff7f340df" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.947376 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:06:20 crc kubenswrapper[4765]: E1003 09:06:20.947810 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0e5f30-95a2-4a98-80c2-cd33dd4328a1" containerName="watcher-kuttl-db-sync" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.947843 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0e5f30-95a2-4a98-80c2-cd33dd4328a1" containerName="watcher-kuttl-db-sync" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.948075 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0e5f30-95a2-4a98-80c2-cd33dd4328a1" containerName="watcher-kuttl-db-sync" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.949093 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.954431 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-f4q6z" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.954588 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:06:20 crc kubenswrapper[4765]: I1003 09:06:20.974832 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.025315 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.026275 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.030574 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.042577 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.051223 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.052435 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.056007 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.093167 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wjn6\" (UniqueName: \"kubernetes.io/projected/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-kube-api-access-7wjn6\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.093259 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.093333 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.093390 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 
09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.093430 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-logs\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.093451 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.105979 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.194928 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.194978 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95jcn\" (UniqueName: \"kubernetes.io/projected/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-kube-api-access-95jcn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195018 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6c827c9-7f24-44dc-841b-246a3eafaae5-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195037 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-logs\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195053 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195071 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195114 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195137 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wjn6\" (UniqueName: \"kubernetes.io/projected/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-kube-api-access-7wjn6\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195153 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195170 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195209 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195244 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195284 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195320 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195354 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 
09:06:21.195386 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195412 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl5pw\" (UniqueName: \"kubernetes.io/projected/c6c827c9-7f24-44dc-841b-246a3eafaae5-kube-api-access-hl5pw\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.195566 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-logs\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.199916 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.200194 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.200286 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.203800 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.214630 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wjn6\" (UniqueName: \"kubernetes.io/projected/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-kube-api-access-7wjn6\") pod \"watcher-kuttl-api-0\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.265432 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.296382 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.296672 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.296762 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.296892 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.296967 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.297035 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.297115 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.297189 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl5pw\" (UniqueName: \"kubernetes.io/projected/c6c827c9-7f24-44dc-841b-246a3eafaae5-kube-api-access-hl5pw\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.297262 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95jcn\" (UniqueName: \"kubernetes.io/projected/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-kube-api-access-95jcn\") 
pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.297336 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6c827c9-7f24-44dc-841b-246a3eafaae5-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.297404 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.297587 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.298016 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6c827c9-7f24-44dc-841b-246a3eafaae5-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.300473 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.301995 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.302536 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.302648 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.302805 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: 
\"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.305677 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.311797 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.315182 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95jcn\" (UniqueName: \"kubernetes.io/projected/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-kube-api-access-95jcn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.315773 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl5pw\" (UniqueName: \"kubernetes.io/projected/c6c827c9-7f24-44dc-841b-246a3eafaae5-kube-api-access-hl5pw\") pod \"watcher-kuttl-applier-0\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.339996 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.381299 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.733046 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.841588 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:06:21 crc kubenswrapper[4765]: W1003 09:06:21.845651 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba2a7e44_33cc_4cf1_8dfc_7aa8306118d2.slice/crio-0a4d82f8e30da1dd48d2f6b4e5554982c50f7491805e027f7dd85feb90d55969 WatchSource:0}: Error finding container 0a4d82f8e30da1dd48d2f6b4e5554982c50f7491805e027f7dd85feb90d55969: Status 404 returned error can't find the container with id 0a4d82f8e30da1dd48d2f6b4e5554982c50f7491805e027f7dd85feb90d55969 Oct 03 09:06:21 crc kubenswrapper[4765]: I1003 09:06:21.921944 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:06:22 crc kubenswrapper[4765]: I1003 09:06:22.715409 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2","Type":"ContainerStarted","Data":"de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696"} Oct 03 09:06:22 crc kubenswrapper[4765]: I1003 09:06:22.715822 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2","Type":"ContainerStarted","Data":"0a4d82f8e30da1dd48d2f6b4e5554982c50f7491805e027f7dd85feb90d55969"} Oct 03 09:06:22 crc kubenswrapper[4765]: I1003 09:06:22.717629 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"c6c827c9-7f24-44dc-841b-246a3eafaae5","Type":"ContainerStarted","Data":"ce94645a3b1bf7876f235e3ac0ce6d987e69566b3de848e9be5e8eca67816971"} Oct 03 09:06:22 crc kubenswrapper[4765]: I1003 09:06:22.717688 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"c6c827c9-7f24-44dc-841b-246a3eafaae5","Type":"ContainerStarted","Data":"206cbd03b1dd601820f9b7ce00262169db25d13c3c7067eec441ffa7df44ef44"} Oct 03 09:06:22 crc kubenswrapper[4765]: I1003 09:06:22.719925 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee","Type":"ContainerStarted","Data":"59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0"} Oct 03 09:06:22 crc kubenswrapper[4765]: I1003 09:06:22.719978 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee","Type":"ContainerStarted","Data":"052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7"} Oct 03 09:06:22 crc kubenswrapper[4765]: I1003 09:06:22.719994 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee","Type":"ContainerStarted","Data":"ef498f5ba8a9608f3f5c387cf09146a241fbed441284f31a1cae3965a60d6696"} Oct 03 09:06:22 crc kubenswrapper[4765]: I1003 09:06:22.720441 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:22 crc kubenswrapper[4765]: I1003 09:06:22.775385 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=1.775364511 podStartE2EDuration="1.775364511s" podCreationTimestamp="2025-10-03 09:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:06:22.7543525 +0000 UTC m=+1627.055846830" watchObservedRunningTime="2025-10-03 09:06:22.775364511 +0000 UTC m=+1627.076858841" Oct 03 09:06:22 crc kubenswrapper[4765]: I1003 09:06:22.777548 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=1.777535077 podStartE2EDuration="1.777535077s" podCreationTimestamp="2025-10-03 09:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:06:22.773053921 +0000 UTC m=+1627.074548251" watchObservedRunningTime="2025-10-03 09:06:22.777535077 +0000 UTC m=+1627.079029407" Oct 03 09:06:22 crc kubenswrapper[4765]: I1003 09:06:22.834652 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.834631837 podStartE2EDuration="2.834631837s" podCreationTimestamp="2025-10-03 09:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:06:22.829867564 +0000 UTC m=+1627.131361894" watchObservedRunningTime="2025-10-03 09:06:22.834631837 +0000 UTC m=+1627.136126167" Oct 03 09:06:23 crc kubenswrapper[4765]: I1003 09:06:23.856738 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:24 crc kubenswrapper[4765]: I1003 09:06:24.307288 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:06:24 crc kubenswrapper[4765]: E1003 09:06:24.308926 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:06:25 crc kubenswrapper[4765]: I1003 09:06:25.052327 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:25 crc kubenswrapper[4765]: I1003 09:06:25.293709 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:26 crc kubenswrapper[4765]: I1003 09:06:26.248490 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:26 crc kubenswrapper[4765]: I1003 09:06:26.266053 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:26 crc kubenswrapper[4765]: I1003 09:06:26.382837 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:26 crc kubenswrapper[4765]: I1003 09:06:26.802670 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:27 crc kubenswrapper[4765]: I1003 09:06:27.431030 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:28 crc kubenswrapper[4765]: I1003 09:06:28.636008 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:29 crc kubenswrapper[4765]: I1003 09:06:29.892244 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:31 crc kubenswrapper[4765]: I1003 09:06:31.121851 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:31 crc kubenswrapper[4765]: I1003 09:06:31.266596 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:31 crc kubenswrapper[4765]: I1003 09:06:31.273457 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:31 crc kubenswrapper[4765]: I1003 09:06:31.340849 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:31 crc kubenswrapper[4765]: I1003 09:06:31.368899 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:31 crc kubenswrapper[4765]: I1003 09:06:31.387401 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:31 crc kubenswrapper[4765]: I1003 09:06:31.412449 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:31 crc kubenswrapper[4765]: I1003 09:06:31.790313 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:31 crc kubenswrapper[4765]: I1003 09:06:31.794960 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:06:31 crc kubenswrapper[4765]: I1003 09:06:31.818661 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:06:31 crc kubenswrapper[4765]: I1003 09:06:31.819213 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:06:32 crc kubenswrapper[4765]: I1003 09:06:32.318383 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:32 crc kubenswrapper[4765]: I1003 09:06:32.594113 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:33 crc kubenswrapper[4765]: I1003 09:06:33.391166 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-db-create-nxj4c"] Oct 03 09:06:33 crc kubenswrapper[4765]: I1003 09:06:33.392410 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-nxj4c" Oct 03 09:06:33 crc kubenswrapper[4765]: I1003 09:06:33.402563 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-create-nxj4c"] Oct 03 09:06:33 crc kubenswrapper[4765]: I1003 09:06:33.437386 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l6bp\" (UniqueName: \"kubernetes.io/projected/5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0-kube-api-access-7l6bp\") pod \"cinder-db-create-nxj4c\" (UID: \"5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0\") " pod="watcher-kuttl-default/cinder-db-create-nxj4c" Oct 03 09:06:33 crc kubenswrapper[4765]: I1003 09:06:33.538756 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l6bp\" (UniqueName: \"kubernetes.io/projected/5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0-kube-api-access-7l6bp\") pod \"cinder-db-create-nxj4c\" (UID: \"5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0\") " pod="watcher-kuttl-default/cinder-db-create-nxj4c" Oct 03 09:06:33 crc kubenswrapper[4765]: I1003 09:06:33.563482 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l6bp\" (UniqueName: \"kubernetes.io/projected/5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0-kube-api-access-7l6bp\") pod \"cinder-db-create-nxj4c\" (UID: \"5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0\") " pod="watcher-kuttl-default/cinder-db-create-nxj4c" Oct 03 09:06:33 crc kubenswrapper[4765]: I1003 09:06:33.714044 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-nxj4c" Oct 03 09:06:33 crc kubenswrapper[4765]: I1003 09:06:33.781768 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.191244 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-create-nxj4c"] Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.611017 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.611526 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="ceilometer-central-agent" containerID="cri-o://bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d" gracePeriod=30 Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.611596 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="proxy-httpd" containerID="cri-o://a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828" gracePeriod=30 Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.611697 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="sg-core" containerID="cri-o://4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b" gracePeriod=30 Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.611661 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="ceilometer-notification-agent" containerID="cri-o://50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49" gracePeriod=30 Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.821639 4765 generic.go:334] "Generic (PLEG): container finished" podID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerID="4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b" exitCode=2 Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.821688 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"62a7a58b-c181-4ef7-b62d-de2a16d2a47c","Type":"ContainerDied","Data":"4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b"} Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.824326 4765 generic.go:334] "Generic (PLEG): container finished" podID="5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0" containerID="b897ac33fcbb2351a24542507c6113278751d033f25dd4c328e4a384f0ebe9e2" exitCode=0 Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.824395 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-nxj4c" event={"ID":"5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0","Type":"ContainerDied","Data":"b897ac33fcbb2351a24542507c6113278751d033f25dd4c328e4a384f0ebe9e2"} Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.824438 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-nxj4c" 
event={"ID":"5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0","Type":"ContainerStarted","Data":"df0bf8e66206b0e763e47632e9c65f03e9104ae1e399e260ccd0dfe82d3bbb76"} Oct 03 09:06:34 crc kubenswrapper[4765]: I1003 09:06:34.989129 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:35 crc kubenswrapper[4765]: I1003 09:06:35.835799 4765 generic.go:334] "Generic (PLEG): container finished" podID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerID="a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828" exitCode=0 Oct 03 09:06:35 crc kubenswrapper[4765]: I1003 09:06:35.835831 4765 generic.go:334] "Generic (PLEG): container finished" podID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerID="bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d" exitCode=0 Oct 03 09:06:35 crc kubenswrapper[4765]: I1003 09:06:35.835895 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"62a7a58b-c181-4ef7-b62d-de2a16d2a47c","Type":"ContainerDied","Data":"a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828"} Oct 03 09:06:35 crc kubenswrapper[4765]: I1003 09:06:35.835941 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"62a7a58b-c181-4ef7-b62d-de2a16d2a47c","Type":"ContainerDied","Data":"bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d"} Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.201090 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-nxj4c" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.221543 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.282374 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l6bp\" (UniqueName: \"kubernetes.io/projected/5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0-kube-api-access-7l6bp\") pod \"5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0\" (UID: \"5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0\") " Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.290800 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0-kube-api-access-7l6bp" (OuterVolumeSpecName: "kube-api-access-7l6bp") pod "5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0" (UID: "5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0"). InnerVolumeSpecName "kube-api-access-7l6bp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.313433 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:06:36 crc kubenswrapper[4765]: E1003 09:06:36.313732 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.384976 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l6bp\" (UniqueName: \"kubernetes.io/projected/5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0-kube-api-access-7l6bp\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.817333 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.858308 4765 generic.go:334] "Generic (PLEG): container finished" podID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerID="50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49" exitCode=0 Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.858953 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"62a7a58b-c181-4ef7-b62d-de2a16d2a47c","Type":"ContainerDied","Data":"50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49"} Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.858995 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"62a7a58b-c181-4ef7-b62d-de2a16d2a47c","Type":"ContainerDied","Data":"5777b6feccf0629666e398cc77016bb325c7ef30d05ff401552bee7ed071bc90"} Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.859029 4765 scope.go:117] "RemoveContainer" containerID="a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.859280 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.865393 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-create-nxj4c" event={"ID":"5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0","Type":"ContainerDied","Data":"df0bf8e66206b0e763e47632e9c65f03e9104ae1e399e260ccd0dfe82d3bbb76"} Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.865441 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df0bf8e66206b0e763e47632e9c65f03e9104ae1e399e260ccd0dfe82d3bbb76" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.865553 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-create-nxj4c" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.895715 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-log-httpd\") pod \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.895822 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-scripts\") pod \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.895864 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-sg-core-conf-yaml\") pod \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.895896 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-run-httpd\") pod \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.895929 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm65c\" (UniqueName: \"kubernetes.io/projected/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-kube-api-access-gm65c\") pod \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.895964 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-combined-ca-bundle\") pod \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.896033 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-config-data\") pod \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.896142 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-ceilometer-tls-certs\") pod \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\" (UID: \"62a7a58b-c181-4ef7-b62d-de2a16d2a47c\") " Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.896502 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "62a7a58b-c181-4ef7-b62d-de2a16d2a47c" (UID: "62a7a58b-c181-4ef7-b62d-de2a16d2a47c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.896915 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "62a7a58b-c181-4ef7-b62d-de2a16d2a47c" (UID: "62a7a58b-c181-4ef7-b62d-de2a16d2a47c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.896964 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.903346 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-kube-api-access-gm65c" (OuterVolumeSpecName: "kube-api-access-gm65c") pod "62a7a58b-c181-4ef7-b62d-de2a16d2a47c" (UID: "62a7a58b-c181-4ef7-b62d-de2a16d2a47c"). InnerVolumeSpecName "kube-api-access-gm65c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.906776 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-scripts" (OuterVolumeSpecName: "scripts") pod "62a7a58b-c181-4ef7-b62d-de2a16d2a47c" (UID: "62a7a58b-c181-4ef7-b62d-de2a16d2a47c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.916512 4765 scope.go:117] "RemoveContainer" containerID="4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.951961 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "62a7a58b-c181-4ef7-b62d-de2a16d2a47c" (UID: "62a7a58b-c181-4ef7-b62d-de2a16d2a47c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.962453 4765 scope.go:117] "RemoveContainer" containerID="50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.975881 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "62a7a58b-c181-4ef7-b62d-de2a16d2a47c" (UID: "62a7a58b-c181-4ef7-b62d-de2a16d2a47c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.984302 4765 scope.go:117] "RemoveContainer" containerID="bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d" Oct 03 09:06:36 crc kubenswrapper[4765]: I1003 09:06:36.998054 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62a7a58b-c181-4ef7-b62d-de2a16d2a47c" (UID: "62a7a58b-c181-4ef7-b62d-de2a16d2a47c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.000395 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.000447 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.000459 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.000468 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.000481 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm65c\" (UniqueName: \"kubernetes.io/projected/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-kube-api-access-gm65c\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.000490 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.005532 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-config-data" (OuterVolumeSpecName: "config-data") pod "62a7a58b-c181-4ef7-b62d-de2a16d2a47c" (UID: "62a7a58b-c181-4ef7-b62d-de2a16d2a47c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.009048 4765 scope.go:117] "RemoveContainer" containerID="a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828" Oct 03 09:06:37 crc kubenswrapper[4765]: E1003 09:06:37.009867 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828\": container with ID starting with a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828 not found: ID does not exist" containerID="a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.009899 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828"} err="failed to get container status \"a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828\": rpc error: code = NotFound desc = could not find container \"a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828\": container with ID starting with a2311cf4e7789e137e2d4959b66683e5b60c999ff4f46934bd967d185c138828 not found: ID does not exist" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.009925 4765 scope.go:117] "RemoveContainer" containerID="4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b" Oct 03 09:06:37 crc kubenswrapper[4765]: E1003 09:06:37.010345 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b\": container with ID starting with 4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b not found: ID does not exist" containerID="4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.010587 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b"} err="failed to get container status \"4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b\": rpc error: code = NotFound desc = could not find container \"4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b\": container with ID starting with 4595af1cd347c986b356c3f9262b22f868bea087e67bad66465aabc2e2d0176b not found: ID does not exist" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.010622 4765 scope.go:117] "RemoveContainer" containerID="50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49" Oct 03 09:06:37 crc kubenswrapper[4765]: E1003 09:06:37.011059 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49\": container with ID starting with 50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49 not found: ID does not exist" containerID="50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.011135 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49"} err="failed to get container status \"50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49\": rpc error: code = NotFound desc = could not 
find container \"50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49\": container with ID starting with 50953f54f4d8e27818d198645502cedfbc75636e8d28835fe71747bc1ce91f49 not found: ID does not exist" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.011201 4765 scope.go:117] "RemoveContainer" containerID="bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d" Oct 03 09:06:37 crc kubenswrapper[4765]: E1003 09:06:37.011813 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d\": container with ID starting with bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d not found: ID does not exist" containerID="bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.011915 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d"} err="failed to get container status \"bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d\": rpc error: code = NotFound desc = could not find container \"bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d\": container with ID starting with bc4f7717660a0772f0811a84b491da8fd73479342e628ba86dcb4188e766279d not found: ID does not exist" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.102450 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62a7a58b-c181-4ef7-b62d-de2a16d2a47c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.197538 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.209246 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.218745 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:06:37 crc kubenswrapper[4765]: E1003 09:06:37.219244 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0" containerName="mariadb-database-create" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.219264 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0" containerName="mariadb-database-create" Oct 03 09:06:37 crc kubenswrapper[4765]: E1003 09:06:37.219280 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="ceilometer-central-agent" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.219288 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="ceilometer-central-agent" Oct 03 09:06:37 crc kubenswrapper[4765]: E1003 09:06:37.219312 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="proxy-httpd" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.219320 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="proxy-httpd" Oct 03 09:06:37 crc kubenswrapper[4765]: E1003 09:06:37.219333 4765 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="sg-core" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.219340 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="sg-core" Oct 03 09:06:37 crc kubenswrapper[4765]: E1003 09:06:37.219357 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="ceilometer-notification-agent" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.219367 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="ceilometer-notification-agent" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.219555 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="sg-core" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.219571 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="proxy-httpd" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.219585 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="ceilometer-central-agent" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.219605 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0" containerName="mariadb-database-create" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.219614 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" containerName="ceilometer-notification-agent" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.221899 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.225898 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.226146 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.229793 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.237140 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.305493 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-scripts\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.305555 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.305605 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-config-data\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.305750 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.305816 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-log-httpd\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.306050 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g922f\" (UniqueName: \"kubernetes.io/projected/5dbe0344-1813-4e84-a956-434bd050bdc1-kube-api-access-g922f\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.306084 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-run-httpd\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.306128 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.408057 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-scripts\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.408112 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.408167 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-config-data\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.408196 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.408217 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-log-httpd\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.408271 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g922f\" (UniqueName: \"kubernetes.io/projected/5dbe0344-1813-4e84-a956-434bd050bdc1-kube-api-access-g922f\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.408288 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-run-httpd\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.408308 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.408867 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-run-httpd\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " 
pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.409412 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-log-httpd\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.418956 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-config-data\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.421422 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.425302 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-scripts\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.425298 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.425523 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.436488 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g922f\" (UniqueName: \"kubernetes.io/projected/5dbe0344-1813-4e84-a956-434bd050bdc1-kube-api-access-g922f\") pod \"ceilometer-0\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.456682 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:37 crc kubenswrapper[4765]: I1003 09:06:37.565427 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:38 crc kubenswrapper[4765]: I1003 09:06:38.024505 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:06:38 crc kubenswrapper[4765]: W1003 09:06:38.024795 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dbe0344_1813_4e84_a956_434bd050bdc1.slice/crio-5e7862abcd6052d41f72b9beae90ce11a1b516e2e1a70282577cef7da8ecc676 WatchSource:0}: Error finding container 5e7862abcd6052d41f72b9beae90ce11a1b516e2e1a70282577cef7da8ecc676: Status 404 returned error can't find the container with id 5e7862abcd6052d41f72b9beae90ce11a1b516e2e1a70282577cef7da8ecc676 Oct 03 09:06:38 crc kubenswrapper[4765]: I1003 09:06:38.316838 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62a7a58b-c181-4ef7-b62d-de2a16d2a47c" path="/var/lib/kubelet/pods/62a7a58b-c181-4ef7-b62d-de2a16d2a47c/volumes" Oct 03 09:06:38 crc kubenswrapper[4765]: I1003 09:06:38.671886 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:38 crc kubenswrapper[4765]: I1003 09:06:38.898398 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5dbe0344-1813-4e84-a956-434bd050bdc1","Type":"ContainerStarted","Data":"b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb"} Oct 03 09:06:38 crc kubenswrapper[4765]: I1003 09:06:38.898452 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5dbe0344-1813-4e84-a956-434bd050bdc1","Type":"ContainerStarted","Data":"5e7862abcd6052d41f72b9beae90ce11a1b516e2e1a70282577cef7da8ecc676"} Oct 03 09:06:39 crc kubenswrapper[4765]: I1003 09:06:39.854294 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:39 crc kubenswrapper[4765]: I1003 09:06:39.908378 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5dbe0344-1813-4e84-a956-434bd050bdc1","Type":"ContainerStarted","Data":"f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161"} Oct 03 09:06:40 crc kubenswrapper[4765]: I1003 09:06:40.917728 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5dbe0344-1813-4e84-a956-434bd050bdc1","Type":"ContainerStarted","Data":"848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548"} Oct 03 09:06:41 crc kubenswrapper[4765]: I1003 09:06:41.047874 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:41 crc kubenswrapper[4765]: I1003 09:06:41.933929 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5dbe0344-1813-4e84-a956-434bd050bdc1","Type":"ContainerStarted","Data":"c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c"} Oct 03 09:06:41 crc kubenswrapper[4765]: I1003 09:06:41.935260 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:06:41 crc kubenswrapper[4765]: I1003 09:06:41.955492 
4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.755961667 podStartE2EDuration="4.955477028s" podCreationTimestamp="2025-10-03 09:06:37 +0000 UTC" firstStartedPulling="2025-10-03 09:06:38.02717064 +0000 UTC m=+1642.328664970" lastFinishedPulling="2025-10-03 09:06:41.226686001 +0000 UTC m=+1645.528180331" observedRunningTime="2025-10-03 09:06:41.951774263 +0000 UTC m=+1646.253268603" watchObservedRunningTime="2025-10-03 09:06:41.955477028 +0000 UTC m=+1646.256971358" Oct 03 09:06:42 crc kubenswrapper[4765]: I1003 09:06:42.243452 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:43 crc kubenswrapper[4765]: I1003 09:06:43.468929 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:43 crc kubenswrapper[4765]: I1003 09:06:43.473232 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-f883-account-create-nmddb"] Oct 03 09:06:43 crc kubenswrapper[4765]: I1003 09:06:43.474346 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-f883-account-create-nmddb" Oct 03 09:06:43 crc kubenswrapper[4765]: I1003 09:06:43.478183 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-db-secret" Oct 03 09:06:43 crc kubenswrapper[4765]: I1003 09:06:43.486976 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-f883-account-create-nmddb"] Oct 03 09:06:43 crc kubenswrapper[4765]: I1003 09:06:43.510834 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms66g\" (UniqueName: \"kubernetes.io/projected/5f05a4bc-df17-4d20-879b-1d082c186426-kube-api-access-ms66g\") pod \"cinder-f883-account-create-nmddb\" (UID: \"5f05a4bc-df17-4d20-879b-1d082c186426\") " pod="watcher-kuttl-default/cinder-f883-account-create-nmddb" Oct 03 09:06:43 crc kubenswrapper[4765]: I1003 09:06:43.611996 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms66g\" (UniqueName: \"kubernetes.io/projected/5f05a4bc-df17-4d20-879b-1d082c186426-kube-api-access-ms66g\") pod \"cinder-f883-account-create-nmddb\" (UID: \"5f05a4bc-df17-4d20-879b-1d082c186426\") " pod="watcher-kuttl-default/cinder-f883-account-create-nmddb" Oct 03 09:06:43 crc kubenswrapper[4765]: I1003 09:06:43.634867 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms66g\" (UniqueName: \"kubernetes.io/projected/5f05a4bc-df17-4d20-879b-1d082c186426-kube-api-access-ms66g\") pod \"cinder-f883-account-create-nmddb\" (UID: \"5f05a4bc-df17-4d20-879b-1d082c186426\") " pod="watcher-kuttl-default/cinder-f883-account-create-nmddb" Oct 03 09:06:43 crc kubenswrapper[4765]: I1003 09:06:43.805145 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-f883-account-create-nmddb" Oct 03 09:06:44 crc kubenswrapper[4765]: I1003 09:06:44.237717 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-f883-account-create-nmddb"] Oct 03 09:06:44 crc kubenswrapper[4765]: W1003 09:06:44.239943 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f05a4bc_df17_4d20_879b_1d082c186426.slice/crio-2c58afea50dfda69b6934cacf0f7a466c18ae66b24a07ddfb642c91a170452c2 WatchSource:0}: Error finding container 2c58afea50dfda69b6934cacf0f7a466c18ae66b24a07ddfb642c91a170452c2: Status 404 returned error can't find the container with id 2c58afea50dfda69b6934cacf0f7a466c18ae66b24a07ddfb642c91a170452c2 Oct 03 09:06:44 crc kubenswrapper[4765]: I1003 09:06:44.673875 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:44 crc kubenswrapper[4765]: I1003 09:06:44.965031 4765 generic.go:334] "Generic (PLEG): container finished" podID="5f05a4bc-df17-4d20-879b-1d082c186426" containerID="a8484ac98edaaf57e9fa0bbd9ed0a814437584819226b3f4a6fdb9b23d694adc" exitCode=0 Oct 03 09:06:44 crc kubenswrapper[4765]: I1003 09:06:44.965079 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-f883-account-create-nmddb" event={"ID":"5f05a4bc-df17-4d20-879b-1d082c186426","Type":"ContainerDied","Data":"a8484ac98edaaf57e9fa0bbd9ed0a814437584819226b3f4a6fdb9b23d694adc"} Oct 03 09:06:44 crc kubenswrapper[4765]: I1003 09:06:44.965105 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-f883-account-create-nmddb" event={"ID":"5f05a4bc-df17-4d20-879b-1d082c186426","Type":"ContainerStarted","Data":"2c58afea50dfda69b6934cacf0f7a466c18ae66b24a07ddfb642c91a170452c2"} Oct 03 09:06:45 crc kubenswrapper[4765]: I1003 09:06:45.886542 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:46 crc kubenswrapper[4765]: I1003 09:06:46.394987 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-f883-account-create-nmddb" Oct 03 09:06:46 crc kubenswrapper[4765]: I1003 09:06:46.487151 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms66g\" (UniqueName: \"kubernetes.io/projected/5f05a4bc-df17-4d20-879b-1d082c186426-kube-api-access-ms66g\") pod \"5f05a4bc-df17-4d20-879b-1d082c186426\" (UID: \"5f05a4bc-df17-4d20-879b-1d082c186426\") " Oct 03 09:06:46 crc kubenswrapper[4765]: I1003 09:06:46.493440 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f05a4bc-df17-4d20-879b-1d082c186426-kube-api-access-ms66g" (OuterVolumeSpecName: "kube-api-access-ms66g") pod "5f05a4bc-df17-4d20-879b-1d082c186426" (UID: "5f05a4bc-df17-4d20-879b-1d082c186426"). InnerVolumeSpecName "kube-api-access-ms66g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:46 crc kubenswrapper[4765]: I1003 09:06:46.588915 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms66g\" (UniqueName: \"kubernetes.io/projected/5f05a4bc-df17-4d20-879b-1d082c186426-kube-api-access-ms66g\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:46 crc kubenswrapper[4765]: I1003 09:06:46.982692 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-f883-account-create-nmddb" event={"ID":"5f05a4bc-df17-4d20-879b-1d082c186426","Type":"ContainerDied","Data":"2c58afea50dfda69b6934cacf0f7a466c18ae66b24a07ddfb642c91a170452c2"} Oct 03 09:06:46 crc kubenswrapper[4765]: I1003 09:06:46.983146 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c58afea50dfda69b6934cacf0f7a466c18ae66b24a07ddfb642c91a170452c2" Oct 03 09:06:46 crc kubenswrapper[4765]: I1003 09:06:46.983226 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-f883-account-create-nmddb" Oct 03 09:06:47 crc kubenswrapper[4765]: I1003 09:06:47.097250 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.317884 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.613120 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-db-sync-kvjq6"] Oct 03 09:06:48 crc kubenswrapper[4765]: E1003 09:06:48.613557 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f05a4bc-df17-4d20-879b-1d082c186426" containerName="mariadb-account-create" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.613582 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f05a4bc-df17-4d20-879b-1d082c186426" containerName="mariadb-account-create" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.613819 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f05a4bc-df17-4d20-879b-1d082c186426" containerName="mariadb-account-create" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.614457 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.616148 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-cinder-dockercfg-k6wj4" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.616894 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-config-data" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.619636 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scripts" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.629763 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-kvjq6"] Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.730828 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr76f\" (UniqueName: \"kubernetes.io/projected/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-kube-api-access-zr76f\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.731215 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-combined-ca-bundle\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.731355 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-scripts\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.731379 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-db-sync-config-data\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.731428 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-config-data\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.731470 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-etc-machine-id\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.833452 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr76f\" (UniqueName: \"kubernetes.io/projected/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-kube-api-access-zr76f\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " 
pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.833510 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-combined-ca-bundle\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.833587 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-scripts\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.833605 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-db-sync-config-data\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.833639 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-config-data\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.833674 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-etc-machine-id\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.833761 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-etc-machine-id\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.839631 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-db-sync-config-data\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.840751 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-combined-ca-bundle\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.840762 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-config-data\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.842375 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-scripts\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.874336 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr76f\" (UniqueName: \"kubernetes.io/projected/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-kube-api-access-zr76f\") pod \"cinder-db-sync-kvjq6\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:48 crc kubenswrapper[4765]: I1003 09:06:48.933240 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:06:49 crc kubenswrapper[4765]: I1003 09:06:49.311406 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:06:49 crc kubenswrapper[4765]: E1003 09:06:49.318693 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:06:49 crc kubenswrapper[4765]: I1003 09:06:49.468300 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-kvjq6"] Oct 03 09:06:49 crc kubenswrapper[4765]: I1003 09:06:49.525362 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:50 crc kubenswrapper[4765]: I1003 09:06:50.021836 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-kvjq6" event={"ID":"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7","Type":"ContainerStarted","Data":"be14796f90edf14d52b7450170631707082aacbd87e4acdf44ae080aeffce5ee"} Oct 03 09:06:50 crc kubenswrapper[4765]: I1003 09:06:50.728519 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:51 crc kubenswrapper[4765]: I1003 09:06:51.976199 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:53 crc kubenswrapper[4765]: I1003 09:06:53.193147 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:54 crc kubenswrapper[4765]: I1003 09:06:54.395838 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:55 crc kubenswrapper[4765]: I1003 09:06:55.591585 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:56 crc 
kubenswrapper[4765]: I1003 09:06:56.819682 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:58 crc kubenswrapper[4765]: I1003 09:06:58.052878 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:06:59 crc kubenswrapper[4765]: I1003 09:06:59.283085 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:00 crc kubenswrapper[4765]: I1003 09:07:00.485092 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:01 crc kubenswrapper[4765]: I1003 09:07:01.702662 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:02 crc kubenswrapper[4765]: I1003 09:07:02.956989 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:03 crc kubenswrapper[4765]: I1003 09:07:03.307343 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:07:03 crc kubenswrapper[4765]: E1003 09:07:03.307812 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:07:04 crc kubenswrapper[4765]: I1003 09:07:04.172975 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:04 crc kubenswrapper[4765]: E1003 09:07:04.233703 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 03 09:07:04 crc kubenswrapper[4765]: E1003 09:07:04.233966 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zr76f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-kvjq6_watcher-kuttl-default(c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 09:07:04 crc kubenswrapper[4765]: E1003 09:07:04.235710 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="watcher-kuttl-default/cinder-db-sync-kvjq6" podUID="c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" Oct 03 09:07:05 crc kubenswrapper[4765]: E1003 09:07:05.183426 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="watcher-kuttl-default/cinder-db-sync-kvjq6" podUID="c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" Oct 03 09:07:05 crc kubenswrapper[4765]: I1003 09:07:05.374762 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:06 crc kubenswrapper[4765]: I1003 09:07:06.622010 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:07 crc kubenswrapper[4765]: I1003 09:07:07.573488 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:07 crc kubenswrapper[4765]: I1003 09:07:07.828149 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:09 crc kubenswrapper[4765]: I1003 09:07:09.003960 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:10 crc kubenswrapper[4765]: I1003 09:07:10.216204 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:11 crc kubenswrapper[4765]: I1003 09:07:11.424700 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:12 crc kubenswrapper[4765]: I1003 09:07:12.644197 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:13 crc kubenswrapper[4765]: I1003 09:07:13.847245 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:15 crc kubenswrapper[4765]: I1003 09:07:15.072849 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:15 crc kubenswrapper[4765]: I1003 09:07:15.307283 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:07:15 crc kubenswrapper[4765]: E1003 09:07:15.307731 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:07:16 crc kubenswrapper[4765]: I1003 09:07:16.057933 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-create-mvkpw"] Oct 03 09:07:16 crc kubenswrapper[4765]: I1003 09:07:16.066075 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-create-mvkpw"] Oct 03 09:07:16 crc kubenswrapper[4765]: I1003 09:07:16.273612 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:16 crc kubenswrapper[4765]: I1003 09:07:16.319095 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96645098-7ec7-4672-8f05-bc20105308e3" 
path="/var/lib/kubelet/pods/96645098-7ec7-4672-8f05-bc20105308e3/volumes" Oct 03 09:07:17 crc kubenswrapper[4765]: I1003 09:07:17.479569 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:18 crc kubenswrapper[4765]: I1003 09:07:18.179357 4765 scope.go:117] "RemoveContainer" containerID="4950246c5c7eca0d45ad61a373617e1f39830c8dfe1935688f16f0279279a881" Oct 03 09:07:18 crc kubenswrapper[4765]: I1003 09:07:18.214623 4765 scope.go:117] "RemoveContainer" containerID="d99bc88f1622d082610e0c7f328b2d0e3ebf9d36de3a796f7ef7cf8bcbcab157" Oct 03 09:07:18 crc kubenswrapper[4765]: I1003 09:07:18.246613 4765 scope.go:117] "RemoveContainer" containerID="7394fa8968e57e58857a0527abf4f3b23ec959eef0ec49bd23f20f27793b6cf6" Oct 03 09:07:18 crc kubenswrapper[4765]: I1003 09:07:18.306006 4765 scope.go:117] "RemoveContainer" containerID="b1be1b329830cd59739528151fc1bd5e2c1f802d74604af5fd6b2579a1031fbd" Oct 03 09:07:18 crc kubenswrapper[4765]: I1003 09:07:18.469109 4765 scope.go:117] "RemoveContainer" containerID="52999060ccc3806573782e5a15676472b0cb4da7a19cc36ad87486daa617d12c" Oct 03 09:07:18 crc kubenswrapper[4765]: I1003 09:07:18.504183 4765 scope.go:117] "RemoveContainer" containerID="3f6a182133151f1955d7525508cf4c5906a5b3a7d463d258e266b946537feca2" Oct 03 09:07:18 crc kubenswrapper[4765]: I1003 09:07:18.528748 4765 scope.go:117] "RemoveContainer" containerID="bcb2cb47a5852d6096be015b93463d8481e5cc8d0b8245a1e1d41c66b6ec90fb" Oct 03 09:07:18 crc kubenswrapper[4765]: I1003 09:07:18.731309 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:19 crc kubenswrapper[4765]: I1003 09:07:19.935546 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:20 crc kubenswrapper[4765]: I1003 09:07:20.316275 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-kvjq6" event={"ID":"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7","Type":"ContainerStarted","Data":"dfda99923840b38f8150060deb154ec2326ee60d46ebe1e406b98d760aea99c7"} Oct 03 09:07:20 crc kubenswrapper[4765]: I1003 09:07:20.336361 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-db-sync-kvjq6" podStartSLOduration=2.7970014069999998 podStartE2EDuration="32.336337657s" podCreationTimestamp="2025-10-03 09:06:48 +0000 UTC" firstStartedPulling="2025-10-03 09:06:49.478390622 +0000 UTC m=+1653.779884952" lastFinishedPulling="2025-10-03 09:07:19.017726872 +0000 UTC m=+1683.319221202" observedRunningTime="2025-10-03 09:07:20.332111298 +0000 UTC m=+1684.633605628" watchObservedRunningTime="2025-10-03 09:07:20.336337657 +0000 UTC m=+1684.637831987" Oct 03 09:07:21 crc kubenswrapper[4765]: I1003 09:07:21.134048 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:22 crc kubenswrapper[4765]: I1003 09:07:22.397627 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:23 crc 
kubenswrapper[4765]: I1003 09:07:23.654040 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:24 crc kubenswrapper[4765]: I1003 09:07:24.355948 4765 generic.go:334] "Generic (PLEG): container finished" podID="c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" containerID="dfda99923840b38f8150060deb154ec2326ee60d46ebe1e406b98d760aea99c7" exitCode=0 Oct 03 09:07:24 crc kubenswrapper[4765]: I1003 09:07:24.355997 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-kvjq6" event={"ID":"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7","Type":"ContainerDied","Data":"dfda99923840b38f8150060deb154ec2326ee60d46ebe1e406b98d760aea99c7"} Oct 03 09:07:24 crc kubenswrapper[4765]: I1003 09:07:24.876293 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.671968 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.783390 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr76f\" (UniqueName: \"kubernetes.io/projected/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-kube-api-access-zr76f\") pod \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.783474 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-config-data\") pod \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.783607 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-db-sync-config-data\") pod \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.783631 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-etc-machine-id\") pod \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.783665 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-combined-ca-bundle\") pod \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.783734 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-scripts\") pod \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\" (UID: \"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7\") " Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.783761 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" (UID: "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.784022 4765 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.813843 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-scripts" (OuterVolumeSpecName: "scripts") pod "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" (UID: "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.819798 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-kube-api-access-zr76f" (OuterVolumeSpecName: "kube-api-access-zr76f") pod "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" (UID: "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7"). InnerVolumeSpecName "kube-api-access-zr76f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.840850 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" (UID: "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.842074 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" (UID: "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.885971 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.886036 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.886047 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.886058 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr76f\" (UniqueName: \"kubernetes.io/projected/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-kube-api-access-zr76f\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.904865 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-config-data" (OuterVolumeSpecName: "config-data") pod "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" (UID: "c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:25 crc kubenswrapper[4765]: I1003 09:07:25.988058 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.117949 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.372985 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-db-sync-kvjq6" event={"ID":"c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7","Type":"ContainerDied","Data":"be14796f90edf14d52b7450170631707082aacbd87e4acdf44ae080aeffce5ee"} Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.373574 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be14796f90edf14d52b7450170631707082aacbd87e4acdf44ae080aeffce5ee" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.373072 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-db-sync-kvjq6" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.681325 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:26 crc kubenswrapper[4765]: E1003 09:07:26.681658 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" containerName="cinder-db-sync" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.681673 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" containerName="cinder-db-sync" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.681865 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" containerName="cinder-db-sync" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.682721 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.690169 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-cinder-dockercfg-k6wj4" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.690400 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-config-data" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.690526 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-backup-config-data" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.690761 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scripts" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.699836 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.737125 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.739096 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.743875 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scheduler-config-data" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.756905 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.818416 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-scripts\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.818471 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.818488 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.818503 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-scripts\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.818734 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.818772 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.818795 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.818845 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" 
Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.818867 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data-custom\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.818900 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-sys\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.818960 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.819045 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5wzj\" (UniqueName: \"kubernetes.io/projected/260b2c35-636a-4254-a743-d7a34677e0cc-kube-api-access-v5wzj\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.819112 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.819174 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.819245 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-run\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.819277 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.819297 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-dev\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 
09:07:26.819372 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.819400 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-lib-modules\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.819419 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcqqt\" (UniqueName: \"kubernetes.io/projected/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-kube-api-access-bcqqt\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.819536 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.819572 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-nvme\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.819589 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.829660 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.831280 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.837121 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-api-config-data" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.854450 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.920795 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.920833 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.920854 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.920870 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.920890 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data-custom\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.920922 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntk6m\" (UniqueName: \"kubernetes.io/projected/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-kube-api-access-ntk6m\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.920942 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-sys\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.920958 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.920977 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-v5wzj\" (UniqueName: \"kubernetes.io/projected/260b2c35-636a-4254-a743-d7a34677e0cc-kube-api-access-v5wzj\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.920996 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-logs\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921027 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921045 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921068 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921085 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921107 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921128 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-run\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921148 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921170 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-dev\") pod \"cinder-backup-0\" (UID: 
\"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921188 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921204 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921220 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-lib-modules\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921239 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcqqt\" (UniqueName: \"kubernetes.io/projected/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-kube-api-access-bcqqt\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921272 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921293 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-nvme\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921307 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921329 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data-custom\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921345 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-scripts\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:26 crc kubenswrapper[4765]: 
I1003 09:07:26.921362 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-scripts\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921383 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921400 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921417 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-scripts\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921743 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921831 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-sys\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.921992 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.922164 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.926580 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.926633 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-run\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " 
pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.926852 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.927006 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-dev\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.927039 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-lib-modules\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.927416 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-nvme\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.933183 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.933632 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.934672 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data-custom\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.935351 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.936157 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.937540 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.937963 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-scripts\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.938271 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-scripts\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.940316 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.941063 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.946563 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.948505 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5wzj\" (UniqueName: \"kubernetes.io/projected/260b2c35-636a-4254-a743-d7a34677e0cc-kube-api-access-v5wzj\") pod \"cinder-backup-0\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:26 crc kubenswrapper[4765]: I1003 09:07:26.954434 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcqqt\" (UniqueName: \"kubernetes.io/projected/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-kube-api-access-bcqqt\") pod \"cinder-scheduler-0\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.006093 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.022722 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.022826 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data-custom\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.022855 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-scripts\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.022936 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntk6m\" (UniqueName: \"kubernetes.io/projected/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-kube-api-access-ntk6m\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.022973 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-logs\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.022992 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.023013 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.023044 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.022830 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.024161 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-logs\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.029702 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.031375 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data-custom\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.032571 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-scripts\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.038586 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.039826 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.046205 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntk6m\" (UniqueName: \"kubernetes.io/projected/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-kube-api-access-ntk6m\") pod \"cinder-api-0\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.062043 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.171035 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.320026 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.648310 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.721128 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:27 crc kubenswrapper[4765]: W1003 09:07:27.733386 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8a8d866_ba44_4829_a8e8_df5c0c40ae8e.slice/crio-9e55728b65f65fe4dcd753ddee20aee601d77cf8a55dd9d5145edce17a406762 WatchSource:0}: Error finding container 9e55728b65f65fe4dcd753ddee20aee601d77cf8a55dd9d5145edce17a406762: Status 404 returned error can't find the container with id 9e55728b65f65fe4dcd753ddee20aee601d77cf8a55dd9d5145edce17a406762 Oct 03 09:07:27 crc kubenswrapper[4765]: I1003 09:07:27.768093 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:28 crc kubenswrapper[4765]: I1003 09:07:28.306522 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:07:28 crc kubenswrapper[4765]: E1003 09:07:28.306797 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:07:28 crc kubenswrapper[4765]: I1003 09:07:28.461160 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1","Type":"ContainerStarted","Data":"f39fe208c04953bbafaeb2d5ba85389334b888e2d537c6de4b1cbab9cc3d163e"} Oct 03 09:07:28 crc kubenswrapper[4765]: I1003 09:07:28.463244 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"260b2c35-636a-4254-a743-d7a34677e0cc","Type":"ContainerStarted","Data":"faecdaf8b623942c9b079962ce42e1f995ea26af381b0b0771c7518f04eab352"} Oct 03 09:07:28 crc kubenswrapper[4765]: I1003 09:07:28.466058 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e","Type":"ContainerStarted","Data":"9e55728b65f65fe4dcd753ddee20aee601d77cf8a55dd9d5145edce17a406762"} Oct 03 09:07:28 crc kubenswrapper[4765]: I1003 09:07:28.546953 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:29 crc kubenswrapper[4765]: I1003 09:07:29.520335 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"260b2c35-636a-4254-a743-d7a34677e0cc","Type":"ContainerStarted","Data":"50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443"} Oct 03 09:07:29 crc 
kubenswrapper[4765]: I1003 09:07:29.524402 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e","Type":"ContainerStarted","Data":"7d251be655899592f4e4c7638dbd63a609f5d166157cb5847cdea70728591818"} Oct 03 09:07:29 crc kubenswrapper[4765]: I1003 09:07:29.535291 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1","Type":"ContainerStarted","Data":"eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2"} Oct 03 09:07:29 crc kubenswrapper[4765]: I1003 09:07:29.771079 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:29 crc kubenswrapper[4765]: I1003 09:07:29.921545 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:30 crc kubenswrapper[4765]: I1003 09:07:30.545137 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"260b2c35-636a-4254-a743-d7a34677e0cc","Type":"ContainerStarted","Data":"c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb"} Oct 03 09:07:30 crc kubenswrapper[4765]: I1003 09:07:30.547840 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e","Type":"ContainerStarted","Data":"4e18a5c8c5eb32a52eb080a822703efc70ee3cd3be4f1ecfd74ba9c99ccbba9b"} Oct 03 09:07:30 crc kubenswrapper[4765]: I1003 09:07:30.550832 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1","Type":"ContainerStarted","Data":"bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd"} Oct 03 09:07:30 crc kubenswrapper[4765]: I1003 09:07:30.550960 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:30 crc kubenswrapper[4765]: I1003 09:07:30.550955 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" containerName="cinder-api-log" containerID="cri-o://eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2" gracePeriod=30 Oct 03 09:07:30 crc kubenswrapper[4765]: I1003 09:07:30.550997 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" containerName="cinder-api" containerID="cri-o://bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd" gracePeriod=30 Oct 03 09:07:30 crc kubenswrapper[4765]: I1003 09:07:30.582858 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-backup-0" podStartSLOduration=3.207492998 podStartE2EDuration="4.582838005s" podCreationTimestamp="2025-10-03 09:07:26 +0000 UTC" firstStartedPulling="2025-10-03 09:07:27.625679215 +0000 UTC m=+1691.927173545" lastFinishedPulling="2025-10-03 09:07:29.001024222 +0000 UTC m=+1693.302518552" observedRunningTime="2025-10-03 09:07:30.578255017 +0000 UTC m=+1694.879749357" watchObservedRunningTime="2025-10-03 09:07:30.582838005 +0000 UTC m=+1694.884332335" Oct 03 09:07:30 crc kubenswrapper[4765]: I1003 09:07:30.616257 4765 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-scheduler-0" podStartSLOduration=3.87001714 podStartE2EDuration="4.616239796s" podCreationTimestamp="2025-10-03 09:07:26 +0000 UTC" firstStartedPulling="2025-10-03 09:07:27.736028937 +0000 UTC m=+1692.037523267" lastFinishedPulling="2025-10-03 09:07:28.482251593 +0000 UTC m=+1692.783745923" observedRunningTime="2025-10-03 09:07:30.607889541 +0000 UTC m=+1694.909383881" watchObservedRunningTime="2025-10-03 09:07:30.616239796 +0000 UTC m=+1694.917734126" Oct 03 09:07:30 crc kubenswrapper[4765]: I1003 09:07:30.646496 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-api-0" podStartSLOduration=4.646474614 podStartE2EDuration="4.646474614s" podCreationTimestamp="2025-10-03 09:07:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:07:30.64085453 +0000 UTC m=+1694.942348860" watchObservedRunningTime="2025-10-03 09:07:30.646474614 +0000 UTC m=+1694.947968944" Oct 03 09:07:30 crc kubenswrapper[4765]: I1003 09:07:30.964836 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.495202 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.571586 4765 generic.go:334] "Generic (PLEG): container finished" podID="91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" containerID="bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd" exitCode=0 Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.571621 4765 generic.go:334] "Generic (PLEG): container finished" podID="91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" containerID="eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2" exitCode=143 Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.571679 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.572460 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1","Type":"ContainerDied","Data":"bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd"} Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.572496 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1","Type":"ContainerDied","Data":"eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2"} Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.572510 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1","Type":"ContainerDied","Data":"f39fe208c04953bbafaeb2d5ba85389334b888e2d537c6de4b1cbab9cc3d163e"} Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.572528 4765 scope.go:117] "RemoveContainer" containerID="bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.642188 4765 scope.go:117] "RemoveContainer" containerID="eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.644574 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-logs\") pod \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.644612 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-etc-machine-id\") pod \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.644682 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-combined-ca-bundle\") pod \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.644784 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data-custom\") pod \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.644836 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-scripts\") pod \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.644918 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data\") pod \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.644991 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-cert-memcached-mtls\") pod \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.645019 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntk6m\" (UniqueName: \"kubernetes.io/projected/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-kube-api-access-ntk6m\") pod \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\" (UID: \"91c71bb9-9664-41a9-a1a7-2821cb1d5ad1\") " Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.646830 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" (UID: "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.647214 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-logs" (OuterVolumeSpecName: "logs") pod "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" (UID: "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.656113 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" (UID: "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.659556 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-kube-api-access-ntk6m" (OuterVolumeSpecName: "kube-api-access-ntk6m") pod "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" (UID: "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1"). InnerVolumeSpecName "kube-api-access-ntk6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.674763 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-scripts" (OuterVolumeSpecName: "scripts") pod "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" (UID: "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.748385 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntk6m\" (UniqueName: \"kubernetes.io/projected/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-kube-api-access-ntk6m\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.748425 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.748438 4765 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.748449 4765 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.748460 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.765864 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data" (OuterVolumeSpecName: "config-data") pod "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" (UID: "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.782849 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" (UID: "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.802085 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" (UID: "91c71bb9-9664-41a9-a1a7-2821cb1d5ad1"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.850289 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.850328 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.850341 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.886913 4765 scope.go:117] "RemoveContainer" containerID="bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd" Oct 03 09:07:31 crc kubenswrapper[4765]: E1003 09:07:31.887490 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd\": container with ID starting with bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd not found: ID does not exist" containerID="bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.887746 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd"} err="failed to get container status \"bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd\": rpc error: code = NotFound desc = could not find container \"bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd\": container with ID starting with bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd not found: ID does not exist" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.888048 4765 scope.go:117] "RemoveContainer" containerID="eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2" Oct 03 09:07:31 crc kubenswrapper[4765]: E1003 09:07:31.888944 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2\": container with ID starting with eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2 not found: ID does not exist" containerID="eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.889057 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2"} err="failed to get container status \"eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2\": rpc error: code = NotFound desc = could not find container \"eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2\": container with ID starting with eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2 not found: ID does not exist" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.889262 4765 scope.go:117] "RemoveContainer" containerID="bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd" Oct 03 09:07:31 crc kubenswrapper[4765]: 
I1003 09:07:31.889969 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd"} err="failed to get container status \"bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd\": rpc error: code = NotFound desc = could not find container \"bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd\": container with ID starting with bb46f8d40bf8f681ce04372f59c6d0df327eeb2398826ddc4b2024022970fffd not found: ID does not exist" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.890086 4765 scope.go:117] "RemoveContainer" containerID="eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.890462 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2"} err="failed to get container status \"eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2\": rpc error: code = NotFound desc = could not find container \"eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2\": container with ID starting with eece139b0de98a577e91d3ccc537b4e71fa2be8aa0251e6bb8ac73648b4178e2 not found: ID does not exist" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.906151 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.916861 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.931390 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:31 crc kubenswrapper[4765]: E1003 09:07:31.932235 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" containerName="cinder-api" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.932339 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" containerName="cinder-api" Oct 03 09:07:31 crc kubenswrapper[4765]: E1003 09:07:31.932475 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" containerName="cinder-api-log" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.932562 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" containerName="cinder-api-log" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.932892 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" containerName="cinder-api" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.933005 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" containerName="cinder-api-log" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.934221 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.938485 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-cinder-internal-svc" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.938739 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-cinder-public-svc" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.938753 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-api-config-data" Oct 03 09:07:31 crc kubenswrapper[4765]: I1003 09:07:31.959474 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.006284 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.057973 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kk5m\" (UniqueName: \"kubernetes.io/projected/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-kube-api-access-4kk5m\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.058031 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.058283 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.058386 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.058753 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.058989 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-logs\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.059029 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.059114 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.059147 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-scripts\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.059242 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.062206 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.161324 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.161418 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-logs\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.161444 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.161468 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.161486 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-scripts\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.161515 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.161541 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kk5m\" (UniqueName: \"kubernetes.io/projected/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-kube-api-access-4kk5m\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.161559 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.161599 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.161626 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.163204 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-logs\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.163724 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.180589 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-cert-memcached-mtls\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.180595 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.180598 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.180917 
4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.180948 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-scripts\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.181130 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.181221 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.187093 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kk5m\" (UniqueName: \"kubernetes.io/projected/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-kube-api-access-4kk5m\") pod \"cinder-api-0\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.221597 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.258520 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.320377 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91c71bb9-9664-41a9-a1a7-2821cb1d5ad1" path="/var/lib/kubelet/pods/91c71bb9-9664-41a9-a1a7-2821cb1d5ad1/volumes" Oct 03 09:07:32 crc kubenswrapper[4765]: I1003 09:07:32.780449 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:32 crc kubenswrapper[4765]: W1003 09:07:32.790833 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod775f1f1a_d2a9_45a6_91d7_9ea015f815a5.slice/crio-f525a662febbbd23fe1a6be5d80282e21df98cebaf61d6942c6d83665ed82739 WatchSource:0}: Error finding container f525a662febbbd23fe1a6be5d80282e21df98cebaf61d6942c6d83665ed82739: Status 404 returned error can't find the container with id f525a662febbbd23fe1a6be5d80282e21df98cebaf61d6942c6d83665ed82739 Oct 03 09:07:33 crc kubenswrapper[4765]: I1003 09:07:33.033954 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-7a7f-account-create-t89nv"] Oct 03 09:07:33 crc kubenswrapper[4765]: I1003 09:07:33.041469 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-7a7f-account-create-t89nv"] Oct 03 09:07:33 crc kubenswrapper[4765]: I1003 09:07:33.445431 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:33 crc kubenswrapper[4765]: I1003 09:07:33.609829 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"775f1f1a-d2a9-45a6-91d7-9ea015f815a5","Type":"ContainerStarted","Data":"9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9"} Oct 03 09:07:33 crc kubenswrapper[4765]: I1003 09:07:33.610212 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"775f1f1a-d2a9-45a6-91d7-9ea015f815a5","Type":"ContainerStarted","Data":"f525a662febbbd23fe1a6be5d80282e21df98cebaf61d6942c6d83665ed82739"} Oct 03 09:07:34 crc kubenswrapper[4765]: I1003 09:07:34.319585 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce7aef1c-e74f-473a-9a1c-751e2c3185e2" path="/var/lib/kubelet/pods/ce7aef1c-e74f-473a-9a1c-751e2c3185e2/volumes" Oct 03 09:07:34 crc kubenswrapper[4765]: I1003 09:07:34.618724 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"775f1f1a-d2a9-45a6-91d7-9ea015f815a5","Type":"ContainerStarted","Data":"b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be"} Oct 03 09:07:34 crc kubenswrapper[4765]: I1003 09:07:34.619137 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:34 crc kubenswrapper[4765]: I1003 09:07:34.647545 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-api-0" podStartSLOduration=3.647522305 podStartE2EDuration="3.647522305s" podCreationTimestamp="2025-10-03 09:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:07:34.640807202 +0000 UTC m=+1698.942301552" watchObservedRunningTime="2025-10-03 09:07:34.647522305 +0000 UTC m=+1698.949016635" 
Oct 03 09:07:34 crc kubenswrapper[4765]: I1003 09:07:34.691434 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:35 crc kubenswrapper[4765]: I1003 09:07:35.909010 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:37 crc kubenswrapper[4765]: I1003 09:07:37.153803 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:37 crc kubenswrapper[4765]: I1003 09:07:37.279110 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:37 crc kubenswrapper[4765]: I1003 09:07:37.358887 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:37 crc kubenswrapper[4765]: I1003 09:07:37.367860 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:37 crc kubenswrapper[4765]: I1003 09:07:37.404356 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:37 crc kubenswrapper[4765]: I1003 09:07:37.641598 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" containerName="cinder-scheduler" containerID="cri-o://7d251be655899592f4e4c7638dbd63a609f5d166157cb5847cdea70728591818" gracePeriod=30 Oct 03 09:07:37 crc kubenswrapper[4765]: I1003 09:07:37.642053 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="260b2c35-636a-4254-a743-d7a34677e0cc" containerName="cinder-backup" containerID="cri-o://50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443" gracePeriod=30 Oct 03 09:07:37 crc kubenswrapper[4765]: I1003 09:07:37.642363 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" containerName="probe" containerID="cri-o://4e18a5c8c5eb32a52eb080a822703efc70ee3cd3be4f1ecfd74ba9c99ccbba9b" gracePeriod=30 Oct 03 09:07:37 crc kubenswrapper[4765]: I1003 09:07:37.642440 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="260b2c35-636a-4254-a743-d7a34677e0cc" containerName="probe" containerID="cri-o://c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb" gracePeriod=30 Oct 03 09:07:38 crc kubenswrapper[4765]: I1003 09:07:38.354384 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:38 crc kubenswrapper[4765]: I1003 09:07:38.653009 4765 generic.go:334] "Generic (PLEG): container finished" podID="260b2c35-636a-4254-a743-d7a34677e0cc" containerID="c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb" exitCode=0 Oct 03 09:07:38 crc kubenswrapper[4765]: I1003 09:07:38.653366 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" 
event={"ID":"260b2c35-636a-4254-a743-d7a34677e0cc","Type":"ContainerDied","Data":"c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb"} Oct 03 09:07:38 crc kubenswrapper[4765]: I1003 09:07:38.655131 4765 generic.go:334] "Generic (PLEG): container finished" podID="e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" containerID="4e18a5c8c5eb32a52eb080a822703efc70ee3cd3be4f1ecfd74ba9c99ccbba9b" exitCode=0 Oct 03 09:07:38 crc kubenswrapper[4765]: I1003 09:07:38.655180 4765 generic.go:334] "Generic (PLEG): container finished" podID="e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" containerID="7d251be655899592f4e4c7638dbd63a609f5d166157cb5847cdea70728591818" exitCode=0 Oct 03 09:07:38 crc kubenswrapper[4765]: I1003 09:07:38.655206 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e","Type":"ContainerDied","Data":"4e18a5c8c5eb32a52eb080a822703efc70ee3cd3be4f1ecfd74ba9c99ccbba9b"} Oct 03 09:07:38 crc kubenswrapper[4765]: I1003 09:07:38.655250 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e","Type":"ContainerDied","Data":"7d251be655899592f4e4c7638dbd63a609f5d166157cb5847cdea70728591818"} Oct 03 09:07:38 crc kubenswrapper[4765]: I1003 09:07:38.805185 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:07:38 crc kubenswrapper[4765]: I1003 09:07:38.805811 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" containerName="watcher-decision-engine" containerID="cri-o://de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696" gracePeriod=30 Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.548619 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.592608 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.598988 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.692096 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e","Type":"ContainerDied","Data":"9e55728b65f65fe4dcd753ddee20aee601d77cf8a55dd9d5145edce17a406762"} Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.692160 4765 scope.go:117] "RemoveContainer" containerID="4e18a5c8c5eb32a52eb080a822703efc70ee3cd3be4f1ecfd74ba9c99ccbba9b" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.692332 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.697800 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data-custom\") pod \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.697879 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-combined-ca-bundle\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.697933 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-scripts\") pod \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.697965 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-cert-memcached-mtls\") pod \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.698031 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-cert-memcached-mtls\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.698061 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data\") pod \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.698104 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-brick\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.698140 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-sys\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.698276 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-sys" (OuterVolumeSpecName: "sys") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.700857 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcqqt\" (UniqueName: \"kubernetes.io/projected/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-kube-api-access-bcqqt\") pod \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.700953 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-cinder\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.700989 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-dev\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701013 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-nvme\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701054 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701083 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data-custom\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701121 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-scripts\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701161 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5wzj\" (UniqueName: \"kubernetes.io/projected/260b2c35-636a-4254-a743-d7a34677e0cc-kube-api-access-v5wzj\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701186 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-iscsi\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701218 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-etc-machine-id\") pod \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\" (UID: 
\"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701244 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-lib-cinder\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701286 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-run\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701317 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-combined-ca-bundle\") pod \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\" (UID: \"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701412 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-lib-modules\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.701467 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-machine-id\") pod \"260b2c35-636a-4254-a743-d7a34677e0cc\" (UID: \"260b2c35-636a-4254-a743-d7a34677e0cc\") " Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.702228 4765 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-sys\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.702283 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.704830 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" (UID: "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.707052 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-scripts" (OuterVolumeSpecName: "scripts") pod "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" (UID: "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.709263 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/260b2c35-636a-4254-a743-d7a34677e0cc-kube-api-access-v5wzj" (OuterVolumeSpecName: "kube-api-access-v5wzj") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "kube-api-access-v5wzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.709334 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.709362 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-dev" (OuterVolumeSpecName: "dev") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.709386 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.710980 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-kube-api-access-bcqqt" (OuterVolumeSpecName: "kube-api-access-bcqqt") pod "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" (UID: "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e"). InnerVolumeSpecName "kube-api-access-bcqqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.711144 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.711256 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" (UID: "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.711342 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.711433 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-run" (OuterVolumeSpecName: "run") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.713625 4765 generic.go:334] "Generic (PLEG): container finished" podID="260b2c35-636a-4254-a743-d7a34677e0cc" containerID="50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443" exitCode=0 Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.713697 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"260b2c35-636a-4254-a743-d7a34677e0cc","Type":"ContainerDied","Data":"50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443"} Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.713731 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"260b2c35-636a-4254-a743-d7a34677e0cc","Type":"ContainerDied","Data":"faecdaf8b623942c9b079962ce42e1f995ea26af381b0b0771c7518f04eab352"} Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.713814 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.714698 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.714754 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.724748 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-scripts" (OuterVolumeSpecName: "scripts") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.725839 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.790062 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" (UID: "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.791059 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803515 4765 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-run\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803562 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803577 4765 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803588 4765 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803597 4765 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803609 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803616 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803624 4765 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803634 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcqqt\" (UniqueName: \"kubernetes.io/projected/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-kube-api-access-bcqqt\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803671 4765 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803683 4765 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-dev\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803693 4765 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803707 4765 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803717 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803727 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5wzj\" (UniqueName: \"kubernetes.io/projected/260b2c35-636a-4254-a743-d7a34677e0cc-kube-api-access-v5wzj\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803737 4765 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803747 4765 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.803757 4765 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/260b2c35-636a-4254-a743-d7a34677e0cc-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.806926 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data" (OuterVolumeSpecName: "config-data") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.826883 4765 scope.go:117] "RemoveContainer" containerID="7d251be655899592f4e4c7638dbd63a609f5d166157cb5847cdea70728591818" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.888344 4765 scope.go:117] "RemoveContainer" containerID="c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.894299 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data" (OuterVolumeSpecName: "config-data") pod "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" (UID: "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.907992 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.908022 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:39 crc kubenswrapper[4765]: I1003 09:07:39.968273 4765 scope.go:117] "RemoveContainer" containerID="50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.017503 4765 scope.go:117] "RemoveContainer" containerID="c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb" Oct 03 09:07:40 crc kubenswrapper[4765]: E1003 09:07:40.018732 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb\": container with ID starting with c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb not found: ID does not exist" containerID="c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.018771 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb"} err="failed to get container status \"c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb\": rpc error: code = NotFound desc = could not find container \"c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb\": container with ID starting with c41ef2caf3db2ea87154fb3d6fe3722ce288239cce04209f2e68bce2d96b22cb not found: ID does not exist" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.018856 4765 scope.go:117] "RemoveContainer" containerID="50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443" Oct 03 09:07:40 crc kubenswrapper[4765]: E1003 09:07:40.019493 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443\": container with ID starting with 50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443 not found: ID does not exist" containerID="50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.019515 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443"} err="failed to get container status \"50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443\": rpc error: code = NotFound desc = could not find container \"50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443\": container with ID starting with 50823c8edfe8e7840fb0b06c30143be35dacad1c8767f08433cbd86227cc2443 not found: ID does not exist" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.030019 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" (UID: 
"e8a8d866-ba44-4829-a8e8-df5c0c40ae8e"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.074829 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "260b2c35-636a-4254-a743-d7a34677e0cc" (UID: "260b2c35-636a-4254-a743-d7a34677e0cc"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.112555 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.112598 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/260b2c35-636a-4254-a743-d7a34677e0cc-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.270301 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.270573 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="ceilometer-central-agent" containerID="cri-o://b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb" gracePeriod=30 Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.270634 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="ceilometer-notification-agent" containerID="cri-o://f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161" gracePeriod=30 Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.270687 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="proxy-httpd" containerID="cri-o://c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c" gracePeriod=30 Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.270616 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="sg-core" containerID="cri-o://848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548" gracePeriod=30 Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.307509 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:07:40 crc kubenswrapper[4765]: E1003 09:07:40.307820 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.339835 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.357360 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.365203 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:40 crc kubenswrapper[4765]: E1003 09:07:40.365683 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" containerName="probe" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.365708 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" containerName="probe" Oct 03 09:07:40 crc kubenswrapper[4765]: E1003 09:07:40.365741 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="260b2c35-636a-4254-a743-d7a34677e0cc" containerName="probe" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.365751 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="260b2c35-636a-4254-a743-d7a34677e0cc" containerName="probe" Oct 03 09:07:40 crc kubenswrapper[4765]: E1003 09:07:40.365766 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="260b2c35-636a-4254-a743-d7a34677e0cc" containerName="cinder-backup" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.365774 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="260b2c35-636a-4254-a743-d7a34677e0cc" containerName="cinder-backup" Oct 03 09:07:40 crc kubenswrapper[4765]: E1003 09:07:40.365799 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" containerName="cinder-scheduler" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.365806 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" containerName="cinder-scheduler" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.365987 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="260b2c35-636a-4254-a743-d7a34677e0cc" containerName="cinder-backup" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.366004 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="260b2c35-636a-4254-a743-d7a34677e0cc" containerName="probe" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.366026 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" containerName="cinder-scheduler" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.366039 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" containerName="probe" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.367270 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.369155 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-scheduler-config-data" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.380357 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.399016 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.416862 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.422574 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.424269 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.424444 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-scripts\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.424494 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96c47af-97c3-4c7e-9679-1cce340373ba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.424560 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.424591 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.424659 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.424680 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.424699 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnpg\" (UniqueName: \"kubernetes.io/projected/e96c47af-97c3-4c7e-9679-1cce340373ba-kube-api-access-nhnpg\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.426913 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cinder-backup-config-data" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.449728 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526218 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526280 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96c47af-97c3-4c7e-9679-1cce340373ba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526310 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526344 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-lib-modules\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526380 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-run\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526383 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96c47af-97c3-4c7e-9679-1cce340373ba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526403 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-scripts\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526456 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526533 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526584 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526613 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526638 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526726 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526769 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526788 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-dev\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526808 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q78mk\" (UniqueName: \"kubernetes.io/projected/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-kube-api-access-q78mk\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526828 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: 
\"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526847 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526867 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnpg\" (UniqueName: \"kubernetes.io/projected/e96c47af-97c3-4c7e-9679-1cce340373ba-kube-api-access-nhnpg\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526888 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526906 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526969 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-sys\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.526988 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.527006 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-scripts\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.531158 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-scripts\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.531905 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.534008 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-cert-memcached-mtls\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.534025 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.537922 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.555856 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhnpg\" (UniqueName: \"kubernetes.io/projected/e96c47af-97c3-4c7e-9679-1cce340373ba-kube-api-access-nhnpg\") pod \"cinder-scheduler-0\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628302 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628374 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-dev\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628397 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q78mk\" (UniqueName: \"kubernetes.io/projected/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-kube-api-access-q78mk\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628435 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628451 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628531 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-sys\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628548 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628572 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628591 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628627 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-lib-modules\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628667 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-run\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628686 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-scripts\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628701 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628732 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628788 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " 
pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.628812 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.629415 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.629527 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.629527 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-sys\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.629675 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.629705 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-dev\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.630494 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.630540 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-lib-modules\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.630604 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.630630 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-iscsi\") pod 
\"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.630675 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-run\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.632882 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-scripts\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.633297 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-cert-memcached-mtls\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.633384 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.633715 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.634565 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.651001 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q78mk\" (UniqueName: \"kubernetes.io/projected/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-kube-api-access-q78mk\") pod \"cinder-backup-0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.690235 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.735754 4765 generic.go:334] "Generic (PLEG): container finished" podID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerID="c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c" exitCode=0 Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.735795 4765 generic.go:334] "Generic (PLEG): container finished" podID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerID="848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548" exitCode=2 Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.735842 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5dbe0344-1813-4e84-a956-434bd050bdc1","Type":"ContainerDied","Data":"c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c"} Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.735904 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5dbe0344-1813-4e84-a956-434bd050bdc1","Type":"ContainerDied","Data":"848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548"} Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.765916 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:40 crc kubenswrapper[4765]: I1003 09:07:40.783595 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/watcher-decision-engine/0.log" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.168972 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.295215 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:41 crc kubenswrapper[4765]: W1003 09:07:41.306417 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e6de015_79c8_4659_b0ea_dffac3d7bfe0.slice/crio-7c7222b7c578a5b2a2c3688f05337679c04bc565ca189797fc0401934a71c241 WatchSource:0}: Error finding container 7c7222b7c578a5b2a2c3688f05337679c04bc565ca189797fc0401934a71c241: Status 404 returned error can't find the container with id 7c7222b7c578a5b2a2c3688f05337679c04bc565ca189797fc0401934a71c241 Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.349372 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.448670 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-logs\") pod \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.448743 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95jcn\" (UniqueName: \"kubernetes.io/projected/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-kube-api-access-95jcn\") pod \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.448830 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-combined-ca-bundle\") pod \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.448886 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-custom-prometheus-ca\") pod \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.448901 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-config-data\") pod \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.448948 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-cert-memcached-mtls\") pod \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\" (UID: \"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2\") " Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.459365 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-logs" (OuterVolumeSpecName: "logs") pod "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" (UID: "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.484633 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-kube-api-access-95jcn" (OuterVolumeSpecName: "kube-api-access-95jcn") pod "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" (UID: "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2"). InnerVolumeSpecName "kube-api-access-95jcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.491467 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" (UID: "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.494865 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" (UID: "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.522009 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-config-data" (OuterVolumeSpecName: "config-data") pod "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" (UID: "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.536065 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" (UID: "ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.550961 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.551007 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.551021 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.551032 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.551044 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.551057 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95jcn\" (UniqueName: \"kubernetes.io/projected/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2-kube-api-access-95jcn\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.748865 4765 generic.go:334] "Generic (PLEG): container finished" podID="ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" containerID="de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696" exitCode=0 Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.749331 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" 
event={"ID":"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2","Type":"ContainerDied","Data":"de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696"} Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.749418 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2","Type":"ContainerDied","Data":"0a4d82f8e30da1dd48d2f6b4e5554982c50f7491805e027f7dd85feb90d55969"} Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.749440 4765 scope.go:117] "RemoveContainer" containerID="de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.749731 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.752025 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e96c47af-97c3-4c7e-9679-1cce340373ba","Type":"ContainerStarted","Data":"456de9d76f2f46a634b9662c99cac9680c1b4b42eaaa366e46cb20670185501a"} Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.755886 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3e6de015-79c8-4659-b0ea-dffac3d7bfe0","Type":"ContainerStarted","Data":"d97dd3000fe2403652860d876d82decd335c2c473ab4fe465ccce2a59850af8a"} Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.755925 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3e6de015-79c8-4659-b0ea-dffac3d7bfe0","Type":"ContainerStarted","Data":"7c7222b7c578a5b2a2c3688f05337679c04bc565ca189797fc0401934a71c241"} Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.762046 4765 generic.go:334] "Generic (PLEG): container finished" podID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerID="b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb" exitCode=0 Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.762096 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5dbe0344-1813-4e84-a956-434bd050bdc1","Type":"ContainerDied","Data":"b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb"} Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.861858 4765 scope.go:117] "RemoveContainer" containerID="de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696" Oct 03 09:07:41 crc kubenswrapper[4765]: E1003 09:07:41.862702 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696\": container with ID starting with de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696 not found: ID does not exist" containerID="de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.862742 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696"} err="failed to get container status \"de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696\": rpc error: code = NotFound desc = could not find container \"de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696\": container with ID starting with 
de9d18011da3ea7f10fb762bcd7f356de4b115da9910de90f6b45abc85109696 not found: ID does not exist" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.926923 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.944554 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.951516 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:07:41 crc kubenswrapper[4765]: E1003 09:07:41.952111 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" containerName="watcher-decision-engine" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.952225 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" containerName="watcher-decision-engine" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.952524 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" containerName="watcher-decision-engine" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.953228 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.959359 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Oct 03 09:07:41 crc kubenswrapper[4765]: I1003 09:07:41.994012 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.060601 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.060672 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v62hn\" (UniqueName: \"kubernetes.io/projected/0323892b-865c-444a-bbe4-f73361d81cb1-kube-api-access-v62hn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.060698 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.060713 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0323892b-865c-444a-bbe4-f73361d81cb1-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc 
kubenswrapper[4765]: I1003 09:07:42.060803 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.060887 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.165759 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.165885 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.165932 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v62hn\" (UniqueName: \"kubernetes.io/projected/0323892b-865c-444a-bbe4-f73361d81cb1-kube-api-access-v62hn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.165968 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.165990 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0323892b-865c-444a-bbe4-f73361d81cb1-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.166079 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.167011 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0323892b-865c-444a-bbe4-f73361d81cb1-logs\") pod 
\"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.169567 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.170017 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.170431 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.188279 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v62hn\" (UniqueName: \"kubernetes.io/projected/0323892b-865c-444a-bbe4-f73361d81cb1-kube-api-access-v62hn\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.188435 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.289397 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.322614 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="260b2c35-636a-4254-a743-d7a34677e0cc" path="/var/lib/kubelet/pods/260b2c35-636a-4254-a743-d7a34677e0cc/volumes" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.323821 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2" path="/var/lib/kubelet/pods/ba2a7e44-33cc-4cf1-8dfc-7aa8306118d2/volumes" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.324403 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a8d866-ba44-4829-a8e8-df5c0c40ae8e" path="/var/lib/kubelet/pods/e8a8d866-ba44-4829-a8e8-df5c0c40ae8e/volumes" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.787412 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e96c47af-97c3-4c7e-9679-1cce340373ba","Type":"ContainerStarted","Data":"4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa"} Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.790240 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3e6de015-79c8-4659-b0ea-dffac3d7bfe0","Type":"ContainerStarted","Data":"012c69d0bb250ee72feb481599a2d4d635a8bb98e2f3072d5012fdccbe993ab5"} Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.948725 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-backup-0" podStartSLOduration=2.948704879 podStartE2EDuration="2.948704879s" podCreationTimestamp="2025-10-03 09:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:07:42.825005504 +0000 UTC m=+1707.126499844" watchObservedRunningTime="2025-10-03 09:07:42.948704879 +0000 UTC m=+1707.250199209" Oct 03 09:07:42 crc kubenswrapper[4765]: I1003 09:07:42.950634 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:07:42 crc kubenswrapper[4765]: W1003 09:07:42.962795 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0323892b_865c_444a_bbe4_f73361d81cb1.slice/crio-fe7305b0a5a6e933065a065036911ea325178ed557d65c40e5bef65e16c5e9c7 WatchSource:0}: Error finding container fe7305b0a5a6e933065a065036911ea325178ed557d65c40e5bef65e16c5e9c7: Status 404 returned error can't find the container with id fe7305b0a5a6e933065a065036911ea325178ed557d65c40e5bef65e16c5e9c7 Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.325342 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.388535 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g922f\" (UniqueName: \"kubernetes.io/projected/5dbe0344-1813-4e84-a956-434bd050bdc1-kube-api-access-g922f\") pod \"5dbe0344-1813-4e84-a956-434bd050bdc1\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.388617 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-ceilometer-tls-certs\") pod \"5dbe0344-1813-4e84-a956-434bd050bdc1\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.388854 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-config-data\") pod \"5dbe0344-1813-4e84-a956-434bd050bdc1\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.388908 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-run-httpd\") pod \"5dbe0344-1813-4e84-a956-434bd050bdc1\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.388931 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-scripts\") pod \"5dbe0344-1813-4e84-a956-434bd050bdc1\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.388958 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-log-httpd\") pod \"5dbe0344-1813-4e84-a956-434bd050bdc1\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.388988 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-combined-ca-bundle\") pod \"5dbe0344-1813-4e84-a956-434bd050bdc1\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.389061 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-sg-core-conf-yaml\") pod \"5dbe0344-1813-4e84-a956-434bd050bdc1\" (UID: \"5dbe0344-1813-4e84-a956-434bd050bdc1\") " Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.390135 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5dbe0344-1813-4e84-a956-434bd050bdc1" (UID: "5dbe0344-1813-4e84-a956-434bd050bdc1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.394515 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-scripts" (OuterVolumeSpecName: "scripts") pod "5dbe0344-1813-4e84-a956-434bd050bdc1" (UID: "5dbe0344-1813-4e84-a956-434bd050bdc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.394961 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dbe0344-1813-4e84-a956-434bd050bdc1-kube-api-access-g922f" (OuterVolumeSpecName: "kube-api-access-g922f") pod "5dbe0344-1813-4e84-a956-434bd050bdc1" (UID: "5dbe0344-1813-4e84-a956-434bd050bdc1"). InnerVolumeSpecName "kube-api-access-g922f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.400532 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5dbe0344-1813-4e84-a956-434bd050bdc1" (UID: "5dbe0344-1813-4e84-a956-434bd050bdc1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.435483 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5dbe0344-1813-4e84-a956-434bd050bdc1" (UID: "5dbe0344-1813-4e84-a956-434bd050bdc1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.491593 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g922f\" (UniqueName: \"kubernetes.io/projected/5dbe0344-1813-4e84-a956-434bd050bdc1-kube-api-access-g922f\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.491621 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.491633 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.491786 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dbe0344-1813-4e84-a956-434bd050bdc1-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.491798 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.509799 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5dbe0344-1813-4e84-a956-434bd050bdc1" (UID: "5dbe0344-1813-4e84-a956-434bd050bdc1"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.520792 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dbe0344-1813-4e84-a956-434bd050bdc1" (UID: "5dbe0344-1813-4e84-a956-434bd050bdc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.565689 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-config-data" (OuterVolumeSpecName: "config-data") pod "5dbe0344-1813-4e84-a956-434bd050bdc1" (UID: "5dbe0344-1813-4e84-a956-434bd050bdc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.594232 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.594446 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.594507 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dbe0344-1813-4e84-a956-434bd050bdc1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.809462 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e96c47af-97c3-4c7e-9679-1cce340373ba","Type":"ContainerStarted","Data":"8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018"} Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.811384 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0323892b-865c-444a-bbe4-f73361d81cb1","Type":"ContainerStarted","Data":"be10a807664e26ba0eb4d68c6ea5c7bcf237b53cfb6e2802a73578559c273a41"} Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.811445 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0323892b-865c-444a-bbe4-f73361d81cb1","Type":"ContainerStarted","Data":"fe7305b0a5a6e933065a065036911ea325178ed557d65c40e5bef65e16c5e9c7"} Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.814193 4765 generic.go:334] "Generic (PLEG): container finished" podID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerID="f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161" exitCode=0 Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.814588 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.814983 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5dbe0344-1813-4e84-a956-434bd050bdc1","Type":"ContainerDied","Data":"f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161"} Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.815074 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"5dbe0344-1813-4e84-a956-434bd050bdc1","Type":"ContainerDied","Data":"5e7862abcd6052d41f72b9beae90ce11a1b516e2e1a70282577cef7da8ecc676"} Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.815142 4765 scope.go:117] "RemoveContainer" containerID="c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.843058 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/cinder-scheduler-0" podStartSLOduration=3.84303258 podStartE2EDuration="3.84303258s" podCreationTimestamp="2025-10-03 09:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:07:43.836746168 +0000 UTC m=+1708.138240508" watchObservedRunningTime="2025-10-03 09:07:43.84303258 +0000 UTC m=+1708.144526910" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.880814 4765 scope.go:117] "RemoveContainer" containerID="848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.882875 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.8828520749999997 podStartE2EDuration="2.882852075s" podCreationTimestamp="2025-10-03 09:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:07:43.874058138 +0000 UTC m=+1708.175552468" watchObservedRunningTime="2025-10-03 09:07:43.882852075 +0000 UTC m=+1708.184346415" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.930101 4765 scope.go:117] "RemoveContainer" containerID="f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.930500 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.938173 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.960872 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:07:43 crc kubenswrapper[4765]: E1003 09:07:43.961565 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="proxy-httpd" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.961601 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="proxy-httpd" Oct 03 09:07:43 crc kubenswrapper[4765]: E1003 09:07:43.961614 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="ceilometer-central-agent" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.961622 4765 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="ceilometer-central-agent" Oct 03 09:07:43 crc kubenswrapper[4765]: E1003 09:07:43.961728 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="sg-core" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.961740 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="sg-core" Oct 03 09:07:43 crc kubenswrapper[4765]: E1003 09:07:43.961754 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="ceilometer-notification-agent" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.961762 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="ceilometer-notification-agent" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.961962 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="ceilometer-central-agent" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.961988 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="sg-core" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.962008 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="ceilometer-notification-agent" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.962019 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" containerName="proxy-httpd" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.965232 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.971315 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.971585 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.971864 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.974856 4765 scope.go:117] "RemoveContainer" containerID="b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb" Oct 03 09:07:43 crc kubenswrapper[4765]: I1003 09:07:43.980617 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.003213 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.003354 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.003394 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnw88\" (UniqueName: \"kubernetes.io/projected/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-kube-api-access-tnw88\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.003424 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.003461 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-scripts\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.003480 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-config-data\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.003508 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.003797 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.021070 4765 scope.go:117] "RemoveContainer" containerID="c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c" Oct 03 09:07:44 crc kubenswrapper[4765]: E1003 09:07:44.022053 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c\": container with ID starting with c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c not found: ID does not exist" containerID="c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.022142 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c"} err="failed to get container status \"c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c\": rpc error: code = NotFound desc = could not find container \"c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c\": container with ID starting with c9a7c8cff14aa929cc1d212e1add72e03d3849017c7ba288aba297471ec75f4c not found: ID does not exist" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.022177 4765 scope.go:117] "RemoveContainer" containerID="848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548" Oct 03 09:07:44 crc kubenswrapper[4765]: E1003 09:07:44.023006 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548\": container with ID starting with 848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548 not found: ID does not exist" containerID="848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.023057 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548"} err="failed to get container status \"848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548\": rpc error: code = NotFound desc = could not find container \"848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548\": container with ID starting with 848b071f1bb670ceae2b3fdb7f9ba0a83275f026ea59a9689396f1e284267548 not found: ID does not exist" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.024691 4765 scope.go:117] "RemoveContainer" containerID="f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161" Oct 03 09:07:44 crc kubenswrapper[4765]: E1003 09:07:44.025551 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161\": container with ID starting with f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161 not found: ID does not exist" 
containerID="f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.025612 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161"} err="failed to get container status \"f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161\": rpc error: code = NotFound desc = could not find container \"f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161\": container with ID starting with f579c547101e5562a5e8cbbf13eefd68ec182efd8182644ad53e45f28d15a161 not found: ID does not exist" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.025680 4765 scope.go:117] "RemoveContainer" containerID="b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb" Oct 03 09:07:44 crc kubenswrapper[4765]: E1003 09:07:44.036387 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb\": container with ID starting with b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb not found: ID does not exist" containerID="b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.036463 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb"} err="failed to get container status \"b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb\": rpc error: code = NotFound desc = could not find container \"b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb\": container with ID starting with b06db2bd1f356e803c7e960bd2bfdda5b8ece438e43f4a8114c4944017c78bfb not found: ID does not exist" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.105742 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.105804 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.105876 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.105903 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnw88\" (UniqueName: \"kubernetes.io/projected/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-kube-api-access-tnw88\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.105931 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.105962 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-scripts\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.105978 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-config-data\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.106005 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.106270 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-log-httpd\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.106891 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-run-httpd\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.112758 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-config-data\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.118932 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.119265 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-scripts\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.120057 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.125509 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.130193 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnw88\" (UniqueName: \"kubernetes.io/projected/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-kube-api-access-tnw88\") pod \"ceilometer-0\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.300722 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.319131 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dbe0344-1813-4e84-a956-434bd050bdc1" path="/var/lib/kubelet/pods/5dbe0344-1813-4e84-a956-434bd050bdc1/volumes" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.430740 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:44 crc kubenswrapper[4765]: I1003 09:07:44.926565 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:07:45 crc kubenswrapper[4765]: I1003 09:07:45.178942 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:45 crc kubenswrapper[4765]: I1003 09:07:45.632584 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:45 crc kubenswrapper[4765]: I1003 09:07:45.690840 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:45 crc kubenswrapper[4765]: I1003 09:07:45.765995 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:45 crc kubenswrapper[4765]: I1003 09:07:45.853884 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd","Type":"ContainerStarted","Data":"b805decbc63a7f8b2e5c57934887b889d345271b2e9b7fbae388b635e85fb43f"} Oct 03 09:07:45 crc kubenswrapper[4765]: I1003 09:07:45.854283 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd","Type":"ContainerStarted","Data":"250581cc273ff9ac6de1e68c68549ca938432873271dc16ff7a89daf38ffd2ef"} Oct 03 09:07:46 crc kubenswrapper[4765]: I1003 09:07:46.832061 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:46 crc kubenswrapper[4765]: I1003 09:07:46.875509 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd","Type":"ContainerStarted","Data":"6405725d2edae85a7d5b3f08a49cc319ae516e1ac22dc6c38d0e00668ebba98b"} Oct 03 09:07:47 crc kubenswrapper[4765]: I1003 09:07:47.886998 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd","Type":"ContainerStarted","Data":"d915578fdc558b573e36108ca401e5bc9c832db86fef955ec3ea965326c8e625"} Oct 03 09:07:48 crc kubenswrapper[4765]: I1003 09:07:48.075287 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:48 crc kubenswrapper[4765]: I1003 09:07:48.900340 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd","Type":"ContainerStarted","Data":"3465bc363592d1eda95facd9c3798534166478e47afca455e8c163603af9ede1"} Oct 03 09:07:48 crc kubenswrapper[4765]: I1003 09:07:48.900738 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:48 crc kubenswrapper[4765]: I1003 09:07:48.926530 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.679652125 podStartE2EDuration="5.926509505s" podCreationTimestamp="2025-10-03 09:07:43 +0000 UTC" firstStartedPulling="2025-10-03 09:07:44.930857632 +0000 UTC m=+1709.232351962" lastFinishedPulling="2025-10-03 09:07:48.177715012 +0000 UTC m=+1712.479209342" observedRunningTime="2025-10-03 09:07:48.922209944 +0000 UTC m=+1713.223704284" watchObservedRunningTime="2025-10-03 09:07:48.926509505 +0000 UTC m=+1713.228003835" Oct 03 09:07:49 crc kubenswrapper[4765]: I1003 09:07:49.282565 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:50 crc kubenswrapper[4765]: I1003 09:07:50.483721 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:50 crc kubenswrapper[4765]: I1003 09:07:50.945208 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:51 crc kubenswrapper[4765]: I1003 09:07:51.008842 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:51 crc kubenswrapper[4765]: I1003 09:07:51.703876 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:52 crc kubenswrapper[4765]: I1003 09:07:52.290407 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:52 crc kubenswrapper[4765]: I1003 09:07:52.313948 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:07:52 crc kubenswrapper[4765]: E1003 09:07:52.314168 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:07:52 crc 
kubenswrapper[4765]: I1003 09:07:52.321772 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:52 crc kubenswrapper[4765]: I1003 09:07:52.954955 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:52 crc kubenswrapper[4765]: I1003 09:07:52.955862 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:52 crc kubenswrapper[4765]: I1003 09:07:52.981336 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.050098 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-fq7mw"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.057932 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-db-sync-fq7mw"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.172634 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.317032 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0271dd37-48aa-49af-bc6e-41b56b8ed75f" path="/var/lib/kubelet/pods/0271dd37-48aa-49af-bc6e-41b56b8ed75f/volumes" Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.423609 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.478181 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-kvjq6"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.485216 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-db-sync-kvjq6"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.516607 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.516964 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="e96c47af-97c3-4c7e-9679-1cce340373ba" containerName="cinder-scheduler" containerID="cri-o://4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa" gracePeriod=30 Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.517018 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-scheduler-0" podUID="e96c47af-97c3-4c7e-9679-1cce340373ba" containerName="probe" containerID="cri-o://8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018" gracePeriod=30 Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.533785 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.534097 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="3e6de015-79c8-4659-b0ea-dffac3d7bfe0" containerName="cinder-backup" 
containerID="cri-o://d97dd3000fe2403652860d876d82decd335c2c473ab4fe465ccce2a59850af8a" gracePeriod=30 Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.534169 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-backup-0" podUID="3e6de015-79c8-4659-b0ea-dffac3d7bfe0" containerName="probe" containerID="cri-o://012c69d0bb250ee72feb481599a2d4d635a8bb98e2f3072d5012fdccbe993ab5" gracePeriod=30 Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.555690 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.555966 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" containerName="cinder-api-log" containerID="cri-o://9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9" gracePeriod=30 Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.556065 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/cinder-api-0" podUID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" containerName="cinder-api" containerID="cri-o://b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be" gracePeriod=30 Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.616913 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/cinderf883-account-delete-t4pjf"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.618060 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinderf883-account-delete-t4pjf" Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.624079 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/cinderf883-account-delete-t4pjf"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.679442 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-db-create-nxj4c"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.691940 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwn4p\" (UniqueName: \"kubernetes.io/projected/3aad0823-82f7-4f21-9e62-ace9456d4a84-kube-api-access-xwn4p\") pod \"cinderf883-account-delete-t4pjf\" (UID: \"3aad0823-82f7-4f21-9e62-ace9456d4a84\") " pod="watcher-kuttl-default/cinderf883-account-delete-t4pjf" Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.694196 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-db-create-nxj4c"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.706526 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-f883-account-create-nmddb"] Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.713478 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinderf883-account-delete-t4pjf"] Oct 03 09:07:54 crc kubenswrapper[4765]: E1003 09:07:54.714077 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-xwn4p], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/cinderf883-account-delete-t4pjf" podUID="3aad0823-82f7-4f21-9e62-ace9456d4a84" Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.727087 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-f883-account-create-nmddb"] Oct 03 
09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.794161 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwn4p\" (UniqueName: \"kubernetes.io/projected/3aad0823-82f7-4f21-9e62-ace9456d4a84-kube-api-access-xwn4p\") pod \"cinderf883-account-delete-t4pjf\" (UID: \"3aad0823-82f7-4f21-9e62-ace9456d4a84\") " pod="watcher-kuttl-default/cinderf883-account-delete-t4pjf" Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.816009 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwn4p\" (UniqueName: \"kubernetes.io/projected/3aad0823-82f7-4f21-9e62-ace9456d4a84-kube-api-access-xwn4p\") pod \"cinderf883-account-delete-t4pjf\" (UID: \"3aad0823-82f7-4f21-9e62-ace9456d4a84\") " pod="watcher-kuttl-default/cinderf883-account-delete-t4pjf" Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.974717 4765 generic.go:334] "Generic (PLEG): container finished" podID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" containerID="9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9" exitCode=143 Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.975018 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"775f1f1a-d2a9-45a6-91d7-9ea015f815a5","Type":"ContainerDied","Data":"9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9"} Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.975109 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinderf883-account-delete-t4pjf" Oct 03 09:07:54 crc kubenswrapper[4765]: I1003 09:07:54.995252 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinderf883-account-delete-t4pjf" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.097671 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwn4p\" (UniqueName: \"kubernetes.io/projected/3aad0823-82f7-4f21-9e62-ace9456d4a84-kube-api-access-xwn4p\") pod \"3aad0823-82f7-4f21-9e62-ace9456d4a84\" (UID: \"3aad0823-82f7-4f21-9e62-ace9456d4a84\") " Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.104553 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aad0823-82f7-4f21-9e62-ace9456d4a84-kube-api-access-xwn4p" (OuterVolumeSpecName: "kube-api-access-xwn4p") pod "3aad0823-82f7-4f21-9e62-ace9456d4a84" (UID: "3aad0823-82f7-4f21-9e62-ace9456d4a84"). InnerVolumeSpecName "kube-api-access-xwn4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.200577 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwn4p\" (UniqueName: \"kubernetes.io/projected/3aad0823-82f7-4f21-9e62-ace9456d4a84-kube-api-access-xwn4p\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.660795 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.803284 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.816612 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-combined-ca-bundle\") pod \"e96c47af-97c3-4c7e-9679-1cce340373ba\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.816756 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data\") pod \"e96c47af-97c3-4c7e-9679-1cce340373ba\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.816800 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data-custom\") pod \"e96c47af-97c3-4c7e-9679-1cce340373ba\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.816819 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-cert-memcached-mtls\") pod \"e96c47af-97c3-4c7e-9679-1cce340373ba\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.816840 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-scripts\") pod \"e96c47af-97c3-4c7e-9679-1cce340373ba\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.816865 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhnpg\" (UniqueName: \"kubernetes.io/projected/e96c47af-97c3-4c7e-9679-1cce340373ba-kube-api-access-nhnpg\") pod \"e96c47af-97c3-4c7e-9679-1cce340373ba\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.816933 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96c47af-97c3-4c7e-9679-1cce340373ba-etc-machine-id\") pod \"e96c47af-97c3-4c7e-9679-1cce340373ba\" (UID: \"e96c47af-97c3-4c7e-9679-1cce340373ba\") " Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.817127 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e96c47af-97c3-4c7e-9679-1cce340373ba-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e96c47af-97c3-4c7e-9679-1cce340373ba" (UID: "e96c47af-97c3-4c7e-9679-1cce340373ba"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.817823 4765 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e96c47af-97c3-4c7e-9679-1cce340373ba-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.836313 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e96c47af-97c3-4c7e-9679-1cce340373ba" (UID: "e96c47af-97c3-4c7e-9679-1cce340373ba"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.837848 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-scripts" (OuterVolumeSpecName: "scripts") pod "e96c47af-97c3-4c7e-9679-1cce340373ba" (UID: "e96c47af-97c3-4c7e-9679-1cce340373ba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.837948 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e96c47af-97c3-4c7e-9679-1cce340373ba-kube-api-access-nhnpg" (OuterVolumeSpecName: "kube-api-access-nhnpg") pod "e96c47af-97c3-4c7e-9679-1cce340373ba" (UID: "e96c47af-97c3-4c7e-9679-1cce340373ba"). InnerVolumeSpecName "kube-api-access-nhnpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.919899 4765 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.919934 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.919948 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhnpg\" (UniqueName: \"kubernetes.io/projected/e96c47af-97c3-4c7e-9679-1cce340373ba-kube-api-access-nhnpg\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.922743 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e96c47af-97c3-4c7e-9679-1cce340373ba" (UID: "e96c47af-97c3-4c7e-9679-1cce340373ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.981974 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data" (OuterVolumeSpecName: "config-data") pod "e96c47af-97c3-4c7e-9679-1cce340373ba" (UID: "e96c47af-97c3-4c7e-9679-1cce340373ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.995442 4765 generic.go:334] "Generic (PLEG): container finished" podID="e96c47af-97c3-4c7e-9679-1cce340373ba" containerID="8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018" exitCode=0 Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.995478 4765 generic.go:334] "Generic (PLEG): container finished" podID="e96c47af-97c3-4c7e-9679-1cce340373ba" containerID="4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa" exitCode=0 Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.995522 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e96c47af-97c3-4c7e-9679-1cce340373ba","Type":"ContainerDied","Data":"8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018"} Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.995553 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e96c47af-97c3-4c7e-9679-1cce340373ba","Type":"ContainerDied","Data":"4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa"} Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.995566 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-scheduler-0" event={"ID":"e96c47af-97c3-4c7e-9679-1cce340373ba","Type":"ContainerDied","Data":"456de9d76f2f46a634b9662c99cac9680c1b4b42eaaa366e46cb20670185501a"} Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.995588 4765 scope.go:117] "RemoveContainer" containerID="8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.995743 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-scheduler-0" Oct 03 09:07:55 crc kubenswrapper[4765]: I1003 09:07:55.999392 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "e96c47af-97c3-4c7e-9679-1cce340373ba" (UID: "e96c47af-97c3-4c7e-9679-1cce340373ba"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.002140 4765 generic.go:334] "Generic (PLEG): container finished" podID="3e6de015-79c8-4659-b0ea-dffac3d7bfe0" containerID="012c69d0bb250ee72feb481599a2d4d635a8bb98e2f3072d5012fdccbe993ab5" exitCode=0 Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.002184 4765 generic.go:334] "Generic (PLEG): container finished" podID="3e6de015-79c8-4659-b0ea-dffac3d7bfe0" containerID="d97dd3000fe2403652860d876d82decd335c2c473ab4fe465ccce2a59850af8a" exitCode=0 Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.002258 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinderf883-account-delete-t4pjf" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.005895 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3e6de015-79c8-4659-b0ea-dffac3d7bfe0","Type":"ContainerDied","Data":"012c69d0bb250ee72feb481599a2d4d635a8bb98e2f3072d5012fdccbe993ab5"} Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.005944 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3e6de015-79c8-4659-b0ea-dffac3d7bfe0","Type":"ContainerDied","Data":"d97dd3000fe2403652860d876d82decd335c2c473ab4fe465ccce2a59850af8a"} Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.021440 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.021482 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.021495 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e96c47af-97c3-4c7e-9679-1cce340373ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.055520 4765 scope.go:117] "RemoveContainer" containerID="4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.065319 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinderf883-account-delete-t4pjf"] Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.072465 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinderf883-account-delete-t4pjf"] Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.089221 4765 scope.go:117] "RemoveContainer" containerID="8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018" Oct 03 09:07:56 crc kubenswrapper[4765]: E1003 09:07:56.089715 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018\": container with ID starting with 8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018 not found: ID does not exist" containerID="8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.089752 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018"} err="failed to get container status \"8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018\": rpc error: code = NotFound desc = could not find container \"8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018\": container with ID starting with 8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018 not found: ID does not exist" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.089785 4765 scope.go:117] "RemoveContainer" containerID="4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa" Oct 03 09:07:56 crc kubenswrapper[4765]: E1003 09:07:56.091920 4765 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa\": container with ID starting with 4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa not found: ID does not exist" containerID="4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.091981 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa"} err="failed to get container status \"4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa\": rpc error: code = NotFound desc = could not find container \"4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa\": container with ID starting with 4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa not found: ID does not exist" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.092032 4765 scope.go:117] "RemoveContainer" containerID="8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.096584 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018"} err="failed to get container status \"8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018\": rpc error: code = NotFound desc = could not find container \"8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018\": container with ID starting with 8a5a8b087b28783f047cc76b278e468a6acbee55b3fdf2c312d41a3c1d6ac018 not found: ID does not exist" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.096636 4765 scope.go:117] "RemoveContainer" containerID="4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.097507 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa"} err="failed to get container status \"4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa\": rpc error: code = NotFound desc = could not find container \"4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa\": container with ID starting with 4f903e85136bfa6f8da321d57e9c9f18f118ae78c26d42ee5dbfdfcdba7109aa not found: ID does not exist" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.338164 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.340913 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aad0823-82f7-4f21-9e62-ace9456d4a84" path="/var/lib/kubelet/pods/3aad0823-82f7-4f21-9e62-ace9456d4a84/volumes" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.341242 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0" path="/var/lib/kubelet/pods/5dd52fdd-5ddc-4475-a90f-0f51f56e4ac0/volumes" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.341881 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f05a4bc-df17-4d20-879b-1d082c186426" path="/var/lib/kubelet/pods/5f05a4bc-df17-4d20-879b-1d082c186426/volumes" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.342368 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7" path="/var/lib/kubelet/pods/c7420dfa-3ae6-4037-9f4e-a43c05cd7ac7/volumes" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.358714 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.367239 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-scheduler-0"] Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.440726 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-nvme\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.441370 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-dev\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.441410 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-brick\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.441512 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q78mk\" (UniqueName: \"kubernetes.io/projected/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-kube-api-access-q78mk\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.441581 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.441612 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-iscsi\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.440830 
4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.441805 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.441910 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-dev" (OuterVolumeSpecName: "dev") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.441928 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.442081 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.442792 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-cinder\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.442896 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-cert-memcached-mtls\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443044 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-lib-cinder\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443126 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-combined-ca-bundle\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443154 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-scripts\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443179 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data-custom\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443244 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-sys\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443266 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-machine-id\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443304 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-run\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443383 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-lib-modules\") pod \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\" (UID: \"3e6de015-79c8-4659-b0ea-dffac3d7bfe0\") " Oct 
03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.445218 4765 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.445242 4765 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-dev\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.445252 4765 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.445264 4765 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.445273 4765 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443402 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443440 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-run" (OuterVolumeSpecName: "run") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443431 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443468 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.443469 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-sys" (OuterVolumeSpecName: "sys") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.445982 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-kube-api-access-q78mk" (OuterVolumeSpecName: "kube-api-access-q78mk") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "kube-api-access-q78mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.449410 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.475952 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-scripts" (OuterVolumeSpecName: "scripts") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.504577 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.548353 4765 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.548406 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.548418 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.548428 4765 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.548438 4765 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-sys\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.548447 4765 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.548460 4765 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-run\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.548470 4765 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.548489 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q78mk\" (UniqueName: \"kubernetes.io/projected/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-kube-api-access-q78mk\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.602136 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data" (OuterVolumeSpecName: "config-data") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.644193 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "3e6de015-79c8-4659-b0ea-dffac3d7bfe0" (UID: "3e6de015-79c8-4659-b0ea-dffac3d7bfe0"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.652992 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.653037 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/3e6de015-79c8-4659-b0ea-dffac3d7bfe0-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:56 crc kubenswrapper[4765]: I1003 09:07:56.873390 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.012565 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-backup-0" event={"ID":"3e6de015-79c8-4659-b0ea-dffac3d7bfe0","Type":"ContainerDied","Data":"7c7222b7c578a5b2a2c3688f05337679c04bc565ca189797fc0401934a71c241"} Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.012639 4765 scope.go:117] "RemoveContainer" containerID="012c69d0bb250ee72feb481599a2d4d635a8bb98e2f3072d5012fdccbe993ab5" Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.012694 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-backup-0" Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.057049 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.065263 4765 scope.go:117] "RemoveContainer" containerID="d97dd3000fe2403652860d876d82decd335c2c473ab4fe465ccce2a59850af8a" Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.068619 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-backup-0"] Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.172354 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.173472 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="0323892b-865c-444a-bbe4-f73361d81cb1" containerName="watcher-decision-engine" containerID="cri-o://be10a807664e26ba0eb4d68c6ea5c7bcf237b53cfb6e2802a73578559c273a41" gracePeriod=30 Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.446689 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.447017 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="ceilometer-central-agent" containerID="cri-o://b805decbc63a7f8b2e5c57934887b889d345271b2e9b7fbae388b635e85fb43f" gracePeriod=30 Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.447114 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="ceilometer-notification-agent" containerID="cri-o://6405725d2edae85a7d5b3f08a49cc319ae516e1ac22dc6c38d0e00668ebba98b" gracePeriod=30 Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.447164 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="proxy-httpd" containerID="cri-o://3465bc363592d1eda95facd9c3798534166478e47afca455e8c163603af9ede1" gracePeriod=30 Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.447093 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="sg-core" containerID="cri-o://d915578fdc558b573e36108ca401e5bc9c832db86fef955ec3ea965326c8e625" gracePeriod=30 Oct 03 09:07:57 crc kubenswrapper[4765]: I1003 09:07:57.970662 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/cinder-api-0" podUID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.203:8776/healthcheck\": read tcp 10.217.0.2:37310->10.217.0.203:8776: read: connection reset by peer" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.026507 4765 generic.go:334] "Generic (PLEG): container finished" podID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerID="3465bc363592d1eda95facd9c3798534166478e47afca455e8c163603af9ede1" exitCode=0 Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.026539 4765 generic.go:334] "Generic (PLEG): container finished" 
podID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerID="d915578fdc558b573e36108ca401e5bc9c832db86fef955ec3ea965326c8e625" exitCode=2 Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.026547 4765 generic.go:334] "Generic (PLEG): container finished" podID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerID="6405725d2edae85a7d5b3f08a49cc319ae516e1ac22dc6c38d0e00668ebba98b" exitCode=0 Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.026554 4765 generic.go:334] "Generic (PLEG): container finished" podID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerID="b805decbc63a7f8b2e5c57934887b889d345271b2e9b7fbae388b635e85fb43f" exitCode=0 Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.026584 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd","Type":"ContainerDied","Data":"3465bc363592d1eda95facd9c3798534166478e47afca455e8c163603af9ede1"} Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.026628 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd","Type":"ContainerDied","Data":"d915578fdc558b573e36108ca401e5bc9c832db86fef955ec3ea965326c8e625"} Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.026638 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd","Type":"ContainerDied","Data":"6405725d2edae85a7d5b3f08a49cc319ae516e1ac22dc6c38d0e00668ebba98b"} Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.026668 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd","Type":"ContainerDied","Data":"b805decbc63a7f8b2e5c57934887b889d345271b2e9b7fbae388b635e85fb43f"} Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.056973 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.320382 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e6de015-79c8-4659-b0ea-dffac3d7bfe0" path="/var/lib/kubelet/pods/3e6de015-79c8-4659-b0ea-dffac3d7bfe0/volumes" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.321156 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e96c47af-97c3-4c7e-9679-1cce340373ba" path="/var/lib/kubelet/pods/e96c47af-97c3-4c7e-9679-1cce340373ba/volumes" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.494888 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.569439 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586052 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-logs\") pod \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586132 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-scripts\") pod \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586177 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-internal-tls-certs\") pod \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586225 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-etc-machine-id\") pod \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586249 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-ceilometer-tls-certs\") pod \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586284 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-log-httpd\") pod \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586310 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-run-httpd\") pod \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586322 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "775f1f1a-d2a9-45a6-91d7-9ea015f815a5" (UID: "775f1f1a-d2a9-45a6-91d7-9ea015f815a5"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586349 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data-custom\") pod \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586397 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-cert-memcached-mtls\") pod \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586441 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data\") pod \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586476 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-combined-ca-bundle\") pod \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586513 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnw88\" (UniqueName: \"kubernetes.io/projected/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-kube-api-access-tnw88\") pod \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586553 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-public-tls-certs\") pod \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586569 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" (UID: "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586583 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-combined-ca-bundle\") pod \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586615 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-config-data\") pod \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586635 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-logs" (OuterVolumeSpecName: "logs") pod "775f1f1a-d2a9-45a6-91d7-9ea015f815a5" (UID: "775f1f1a-d2a9-45a6-91d7-9ea015f815a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586670 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kk5m\" (UniqueName: \"kubernetes.io/projected/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-kube-api-access-4kk5m\") pod \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586726 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-sg-core-conf-yaml\") pod \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\" (UID: \"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586747 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-scripts\") pod \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\" (UID: \"775f1f1a-d2a9-45a6-91d7-9ea015f815a5\") " Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.586812 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" (UID: "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.587193 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.587213 4765 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.587221 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.587229 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.594216 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "775f1f1a-d2a9-45a6-91d7-9ea015f815a5" (UID: "775f1f1a-d2a9-45a6-91d7-9ea015f815a5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.599976 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-scripts" (OuterVolumeSpecName: "scripts") pod "775f1f1a-d2a9-45a6-91d7-9ea015f815a5" (UID: "775f1f1a-d2a9-45a6-91d7-9ea015f815a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.602794 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-kube-api-access-tnw88" (OuterVolumeSpecName: "kube-api-access-tnw88") pod "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" (UID: "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd"). InnerVolumeSpecName "kube-api-access-tnw88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.601930 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-kube-api-access-4kk5m" (OuterVolumeSpecName: "kube-api-access-4kk5m") pod "775f1f1a-d2a9-45a6-91d7-9ea015f815a5" (UID: "775f1f1a-d2a9-45a6-91d7-9ea015f815a5"). InnerVolumeSpecName "kube-api-access-4kk5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.625412 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-scripts" (OuterVolumeSpecName: "scripts") pod "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" (UID: "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.634218 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "775f1f1a-d2a9-45a6-91d7-9ea015f815a5" (UID: "775f1f1a-d2a9-45a6-91d7-9ea015f815a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.635421 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" (UID: "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.670945 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" (UID: "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.671241 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data" (OuterVolumeSpecName: "config-data") pod "775f1f1a-d2a9-45a6-91d7-9ea015f815a5" (UID: "775f1f1a-d2a9-45a6-91d7-9ea015f815a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.675336 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "775f1f1a-d2a9-45a6-91d7-9ea015f815a5" (UID: "775f1f1a-d2a9-45a6-91d7-9ea015f815a5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.682224 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" (UID: "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.688618 4765 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.688864 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.688926 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.688988 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnw88\" (UniqueName: \"kubernetes.io/projected/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-kube-api-access-tnw88\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.689093 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.689150 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kk5m\" (UniqueName: \"kubernetes.io/projected/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-kube-api-access-4kk5m\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.689202 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.689257 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.689322 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.689635 4765 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.689742 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.698689 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "775f1f1a-d2a9-45a6-91d7-9ea015f815a5" (UID: "775f1f1a-d2a9-45a6-91d7-9ea015f815a5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.720732 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-config-data" (OuterVolumeSpecName: "config-data") pod "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" (UID: "ae3ea092-9e9e-4c82-a915-e6f786a1f9bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.725185 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "775f1f1a-d2a9-45a6-91d7-9ea015f815a5" (UID: "775f1f1a-d2a9-45a6-91d7-9ea015f815a5"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.791285 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.791335 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:58 crc kubenswrapper[4765]: I1003 09:07:58.791347 4765 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/775f1f1a-d2a9-45a6-91d7-9ea015f815a5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.038199 4765 generic.go:334] "Generic (PLEG): container finished" podID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" containerID="b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be" exitCode=0 Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.038246 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"775f1f1a-d2a9-45a6-91d7-9ea015f815a5","Type":"ContainerDied","Data":"b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be"} Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.038278 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/cinder-api-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.038304 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/cinder-api-0" event={"ID":"775f1f1a-d2a9-45a6-91d7-9ea015f815a5","Type":"ContainerDied","Data":"f525a662febbbd23fe1a6be5d80282e21df98cebaf61d6942c6d83665ed82739"} Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.038326 4765 scope.go:117] "RemoveContainer" containerID="b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.041129 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"ae3ea092-9e9e-4c82-a915-e6f786a1f9bd","Type":"ContainerDied","Data":"250581cc273ff9ac6de1e68c68549ca938432873271dc16ff7a89daf38ffd2ef"} Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.041228 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.061419 4765 scope.go:117] "RemoveContainer" containerID="9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.070426 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.078335 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/cinder-api-0"] Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.088609 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.090172 4765 scope.go:117] "RemoveContainer" containerID="b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be" Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.090634 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be\": container with ID starting with b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be not found: ID does not exist" containerID="b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.090689 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be"} err="failed to get container status \"b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be\": rpc error: code = NotFound desc = could not find container \"b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be\": container with ID starting with b0761d1d1406df64e7df209c406d9b18463e3f500094ce75b7bd8a795f9271be not found: ID does not exist" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.090716 4765 scope.go:117] "RemoveContainer" containerID="9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9" Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.091086 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9\": container with ID starting with 9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9 not found: ID does not exist" containerID="9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.091119 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9"} err="failed to get container status \"9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9\": rpc error: code = NotFound desc = could not find container \"9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9\": container with ID starting with 9b40d33389dbf3b866a6e031bf6639fcc656cc3db5295965e512f9492e6d40f9 not found: ID does not exist" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.091138 4765 scope.go:117] "RemoveContainer" containerID="3465bc363592d1eda95facd9c3798534166478e47afca455e8c163603af9ede1" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.095557 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:07:59 
crc kubenswrapper[4765]: I1003 09:07:59.111805 4765 scope.go:117] "RemoveContainer" containerID="d915578fdc558b573e36108ca401e5bc9c832db86fef955ec3ea965326c8e625" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.122007 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.122569 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" containerName="cinder-api-log" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.122589 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" containerName="cinder-api-log" Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.122604 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="ceilometer-notification-agent" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.122612 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="ceilometer-notification-agent" Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.122631 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96c47af-97c3-4c7e-9679-1cce340373ba" containerName="probe" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.122638 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96c47af-97c3-4c7e-9679-1cce340373ba" containerName="probe" Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.122666 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="proxy-httpd" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.122674 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="proxy-httpd" Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.122691 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" containerName="cinder-api" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.122699 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" containerName="cinder-api" Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.122717 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6de015-79c8-4659-b0ea-dffac3d7bfe0" containerName="cinder-backup" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.122725 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6de015-79c8-4659-b0ea-dffac3d7bfe0" containerName="cinder-backup" Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.122737 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="sg-core" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.122743 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="sg-core" Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.122761 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e6de015-79c8-4659-b0ea-dffac3d7bfe0" containerName="probe" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.122770 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e6de015-79c8-4659-b0ea-dffac3d7bfe0" containerName="probe" Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.122781 4765 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e96c47af-97c3-4c7e-9679-1cce340373ba" containerName="cinder-scheduler" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.122788 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96c47af-97c3-4c7e-9679-1cce340373ba" containerName="cinder-scheduler" Oct 03 09:07:59 crc kubenswrapper[4765]: E1003 09:07:59.122799 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="ceilometer-central-agent" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.122806 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="ceilometer-central-agent" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.123019 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96c47af-97c3-4c7e-9679-1cce340373ba" containerName="probe" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.123054 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6de015-79c8-4659-b0ea-dffac3d7bfe0" containerName="probe" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.123069 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="ceilometer-central-agent" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.123079 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e6de015-79c8-4659-b0ea-dffac3d7bfe0" containerName="cinder-backup" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.123088 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="ceilometer-notification-agent" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.123098 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="proxy-httpd" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.123108 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" containerName="cinder-api-log" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.123118 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" containerName="cinder-api" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.123132 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96c47af-97c3-4c7e-9679-1cce340373ba" containerName="cinder-scheduler" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.123145 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" containerName="sg-core" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.125730 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.128833 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.128968 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.129064 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.141715 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.145483 4765 scope.go:117] "RemoveContainer" containerID="6405725d2edae85a7d5b3f08a49cc319ae516e1ac22dc6c38d0e00668ebba98b" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.182986 4765 scope.go:117] "RemoveContainer" containerID="b805decbc63a7f8b2e5c57934887b889d345271b2e9b7fbae388b635e85fb43f" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.198556 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-config-data\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.198603 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz7cd\" (UniqueName: \"kubernetes.io/projected/78b9aabd-dac4-465e-93c6-0e6199890b40-kube-api-access-nz7cd\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.198814 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.198895 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.198962 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.199050 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-log-httpd\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.199102 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-scripts\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.199216 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-run-httpd\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.283107 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.301062 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-run-httpd\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.301142 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-config-data\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.301179 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz7cd\" (UniqueName: \"kubernetes.io/projected/78b9aabd-dac4-465e-93c6-0e6199890b40-kube-api-access-nz7cd\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.301258 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.301285 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.301306 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.301336 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-log-httpd\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc 
kubenswrapper[4765]: I1003 09:07:59.301357 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-scripts\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.301695 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-run-httpd\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.305292 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-log-httpd\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.305906 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.306013 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-scripts\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.308342 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.311335 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.312665 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-config-data\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.322271 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz7cd\" (UniqueName: \"kubernetes.io/projected/78b9aabd-dac4-465e-93c6-0e6199890b40-kube-api-access-nz7cd\") pod \"ceilometer-0\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.444688 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:07:59 crc kubenswrapper[4765]: I1003 09:07:59.903826 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:08:00 crc kubenswrapper[4765]: I1003 09:08:00.050346 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"78b9aabd-dac4-465e-93c6-0e6199890b40","Type":"ContainerStarted","Data":"a5349b9793c6371f457d2058c9076fe9eb4edc43a48e6514099e781b5a368f26"} Oct 03 09:08:00 crc kubenswrapper[4765]: I1003 09:08:00.320459 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="775f1f1a-d2a9-45a6-91d7-9ea015f815a5" path="/var/lib/kubelet/pods/775f1f1a-d2a9-45a6-91d7-9ea015f815a5/volumes" Oct 03 09:08:00 crc kubenswrapper[4765]: I1003 09:08:00.321319 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae3ea092-9e9e-4c82-a915-e6f786a1f9bd" path="/var/lib/kubelet/pods/ae3ea092-9e9e-4c82-a915-e6f786a1f9bd/volumes" Oct 03 09:08:00 crc kubenswrapper[4765]: I1003 09:08:00.487222 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:08:01 crc kubenswrapper[4765]: I1003 09:08:01.061465 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"78b9aabd-dac4-465e-93c6-0e6199890b40","Type":"ContainerStarted","Data":"be70af77a826ff13cbe22bb18412cc2099feae7abb5f1222016c016d21b016c9"} Oct 03 09:08:01 crc kubenswrapper[4765]: I1003 09:08:01.719552 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.085895 4765 generic.go:334] "Generic (PLEG): container finished" podID="0323892b-865c-444a-bbe4-f73361d81cb1" containerID="be10a807664e26ba0eb4d68c6ea5c7bcf237b53cfb6e2802a73578559c273a41" exitCode=0 Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.086568 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0323892b-865c-444a-bbe4-f73361d81cb1","Type":"ContainerDied","Data":"be10a807664e26ba0eb4d68c6ea5c7bcf237b53cfb6e2802a73578559c273a41"} Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.090100 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"78b9aabd-dac4-465e-93c6-0e6199890b40","Type":"ContainerStarted","Data":"7e4d64087b1fec4ecb167e1456cba2d2daf4a56df98ea15c9be2879c2e8309a0"} Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.173317 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.258770 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-custom-prometheus-ca\") pod \"0323892b-865c-444a-bbe4-f73361d81cb1\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.258900 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-config-data\") pod \"0323892b-865c-444a-bbe4-f73361d81cb1\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.259021 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-cert-memcached-mtls\") pod \"0323892b-865c-444a-bbe4-f73361d81cb1\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.259056 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-combined-ca-bundle\") pod \"0323892b-865c-444a-bbe4-f73361d81cb1\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.259120 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v62hn\" (UniqueName: \"kubernetes.io/projected/0323892b-865c-444a-bbe4-f73361d81cb1-kube-api-access-v62hn\") pod \"0323892b-865c-444a-bbe4-f73361d81cb1\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.259157 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0323892b-865c-444a-bbe4-f73361d81cb1-logs\") pod \"0323892b-865c-444a-bbe4-f73361d81cb1\" (UID: \"0323892b-865c-444a-bbe4-f73361d81cb1\") " Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.259826 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0323892b-865c-444a-bbe4-f73361d81cb1-logs" (OuterVolumeSpecName: "logs") pod "0323892b-865c-444a-bbe4-f73361d81cb1" (UID: "0323892b-865c-444a-bbe4-f73361d81cb1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.282265 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0323892b-865c-444a-bbe4-f73361d81cb1-kube-api-access-v62hn" (OuterVolumeSpecName: "kube-api-access-v62hn") pod "0323892b-865c-444a-bbe4-f73361d81cb1" (UID: "0323892b-865c-444a-bbe4-f73361d81cb1"). InnerVolumeSpecName "kube-api-access-v62hn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.360821 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v62hn\" (UniqueName: \"kubernetes.io/projected/0323892b-865c-444a-bbe4-f73361d81cb1-kube-api-access-v62hn\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.361037 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0323892b-865c-444a-bbe4-f73361d81cb1-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.363083 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0323892b-865c-444a-bbe4-f73361d81cb1" (UID: "0323892b-865c-444a-bbe4-f73361d81cb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.417989 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "0323892b-865c-444a-bbe4-f73361d81cb1" (UID: "0323892b-865c-444a-bbe4-f73361d81cb1"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.449819 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-config-data" (OuterVolumeSpecName: "config-data") pod "0323892b-865c-444a-bbe4-f73361d81cb1" (UID: "0323892b-865c-444a-bbe4-f73361d81cb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.450468 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "0323892b-865c-444a-bbe4-f73361d81cb1" (UID: "0323892b-865c-444a-bbe4-f73361d81cb1"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.462668 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.462729 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.462740 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.462752 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0323892b-865c-444a-bbe4-f73361d81cb1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:02 crc kubenswrapper[4765]: I1003 09:08:02.944528 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_0323892b-865c-444a-bbe4-f73361d81cb1/watcher-decision-engine/0.log" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.100792 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"0323892b-865c-444a-bbe4-f73361d81cb1","Type":"ContainerDied","Data":"fe7305b0a5a6e933065a065036911ea325178ed557d65c40e5bef65e16c5e9c7"} Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.100849 4765 scope.go:117] "RemoveContainer" containerID="be10a807664e26ba0eb4d68c6ea5c7bcf237b53cfb6e2802a73578559c273a41" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.100970 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.119292 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"78b9aabd-dac4-465e-93c6-0e6199890b40","Type":"ContainerStarted","Data":"a1225aa184b4b800149b86dc4f7e720809d9d0bd2564424d0ecb327cfc3e32cb"} Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.137996 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.144195 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.161013 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:08:03 crc kubenswrapper[4765]: E1003 09:08:03.161406 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0323892b-865c-444a-bbe4-f73361d81cb1" containerName="watcher-decision-engine" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.161426 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0323892b-865c-444a-bbe4-f73361d81cb1" containerName="watcher-decision-engine" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.162215 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0323892b-865c-444a-bbe4-f73361d81cb1" containerName="watcher-decision-engine" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.171349 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.179104 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.180028 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.275927 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4c7fcd-bff5-453b-9bda-72e97366bc30-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.275991 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.276057 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.276110 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.276191 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.276218 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5647s\" (UniqueName: \"kubernetes.io/projected/aa4c7fcd-bff5-453b-9bda-72e97366bc30-kube-api-access-5647s\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.378043 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.378097 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5647s\" (UniqueName: \"kubernetes.io/projected/aa4c7fcd-bff5-453b-9bda-72e97366bc30-kube-api-access-5647s\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.378117 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4c7fcd-bff5-453b-9bda-72e97366bc30-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.378147 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.378203 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.378240 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.378683 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4c7fcd-bff5-453b-9bda-72e97366bc30-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.381982 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.382067 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.382242 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.389215 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.401632 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5647s\" (UniqueName: \"kubernetes.io/projected/aa4c7fcd-bff5-453b-9bda-72e97366bc30-kube-api-access-5647s\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.488531 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:03 crc kubenswrapper[4765]: I1003 09:08:03.950529 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:08:04 crc kubenswrapper[4765]: I1003 09:08:04.142092 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"78b9aabd-dac4-465e-93c6-0e6199890b40","Type":"ContainerStarted","Data":"236c63359cd8872236a6a89cf2cb6e3f388f04a643f9dc8573f79c825364ae91"} Oct 03 09:08:04 crc kubenswrapper[4765]: I1003 09:08:04.143388 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:04 crc kubenswrapper[4765]: I1003 09:08:04.146923 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"aa4c7fcd-bff5-453b-9bda-72e97366bc30","Type":"ContainerStarted","Data":"b17f8207e84939b59ca8c95917e139f6da88fa05e7c2beda2e740fdb8a7fe7f5"} Oct 03 09:08:04 crc kubenswrapper[4765]: I1003 09:08:04.165028 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.568830657 podStartE2EDuration="5.165010402s" podCreationTimestamp="2025-10-03 09:07:59 +0000 UTC" firstStartedPulling="2025-10-03 09:07:59.911159551 +0000 UTC m=+1724.212653881" lastFinishedPulling="2025-10-03 09:08:03.507339296 +0000 UTC m=+1727.808833626" observedRunningTime="2025-10-03 09:08:04.16455001 +0000 UTC m=+1728.466044340" watchObservedRunningTime="2025-10-03 09:08:04.165010402 +0000 UTC m=+1728.466504732" Oct 03 09:08:04 crc kubenswrapper[4765]: I1003 09:08:04.309842 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:08:04 crc kubenswrapper[4765]: E1003 09:08:04.310039 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:08:04 crc kubenswrapper[4765]: I1003 09:08:04.320583 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0323892b-865c-444a-bbe4-f73361d81cb1" path="/var/lib/kubelet/pods/0323892b-865c-444a-bbe4-f73361d81cb1/volumes" Oct 03 09:08:05 crc kubenswrapper[4765]: I1003 09:08:05.166520 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"aa4c7fcd-bff5-453b-9bda-72e97366bc30","Type":"ContainerStarted","Data":"17a492e825edc0d43037fcce9d5cefdb2f7464e8af080eb65a27a1198b235738"} Oct 03 09:08:05 crc kubenswrapper[4765]: I1003 09:08:05.191956 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.191937357 podStartE2EDuration="2.191937357s" podCreationTimestamp="2025-10-03 09:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:08:05.182395251 +0000 UTC m=+1729.483889601" watchObservedRunningTime="2025-10-03 09:08:05.191937357 +0000 UTC m=+1729.493431687" Oct 03 09:08:05 
crc kubenswrapper[4765]: I1003 09:08:05.342477 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_aa4c7fcd-bff5-453b-9bda-72e97366bc30/watcher-decision-engine/0.log" Oct 03 09:08:06 crc kubenswrapper[4765]: I1003 09:08:06.522203 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_aa4c7fcd-bff5-453b-9bda-72e97366bc30/watcher-decision-engine/0.log" Oct 03 09:08:07 crc kubenswrapper[4765]: I1003 09:08:07.742952 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_aa4c7fcd-bff5-453b-9bda-72e97366bc30/watcher-decision-engine/0.log" Oct 03 09:08:08 crc kubenswrapper[4765]: I1003 09:08:08.951778 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_aa4c7fcd-bff5-453b-9bda-72e97366bc30/watcher-decision-engine/0.log" Oct 03 09:08:10 crc kubenswrapper[4765]: I1003 09:08:10.179713 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_aa4c7fcd-bff5-453b-9bda-72e97366bc30/watcher-decision-engine/0.log" Oct 03 09:08:11 crc kubenswrapper[4765]: I1003 09:08:11.394586 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_aa4c7fcd-bff5-453b-9bda-72e97366bc30/watcher-decision-engine/0.log" Oct 03 09:08:12 crc kubenswrapper[4765]: I1003 09:08:12.596551 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_aa4c7fcd-bff5-453b-9bda-72e97366bc30/watcher-decision-engine/0.log" Oct 03 09:08:13 crc kubenswrapper[4765]: I1003 09:08:13.489016 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:13 crc kubenswrapper[4765]: I1003 09:08:13.515082 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:13 crc kubenswrapper[4765]: I1003 09:08:13.816410 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_aa4c7fcd-bff5-453b-9bda-72e97366bc30/watcher-decision-engine/0.log" Oct 03 09:08:14 crc kubenswrapper[4765]: I1003 09:08:14.253430 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:14 crc kubenswrapper[4765]: I1003 09:08:14.276519 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.026742 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_watcher-kuttl-decision-engine-0_aa4c7fcd-bff5-453b-9bda-72e97366bc30/watcher-decision-engine/0.log" Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.148230 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv"] Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.158589 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-lzqdv"] Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.221579 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:08:15 crc 
kubenswrapper[4765]: I1003 09:08:15.221947 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="c6c827c9-7f24-44dc-841b-246a3eafaae5" containerName="watcher-applier" containerID="cri-o://ce94645a3b1bf7876f235e3ac0ce6d987e69566b3de848e9be5e8eca67816971" gracePeriod=30 Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.228330 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcherbc0e-account-delete-2zz5q"] Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.229404 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherbc0e-account-delete-2zz5q" Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.240144 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcherbc0e-account-delete-2zz5q"] Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.262963 4765 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-f4q6z\" not found" Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.271389 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-wfptv"] Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.279727 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.284379 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6h2s\" (UniqueName: \"kubernetes.io/projected/6f162962-d9a4-4bfe-b2e2-0843ada74f39-kube-api-access-v6h2s\") pod \"watcherbc0e-account-delete-2zz5q\" (UID: \"6f162962-d9a4-4bfe-b2e2-0843ada74f39\") " pod="watcher-kuttl-default/watcherbc0e-account-delete-2zz5q" Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.286434 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-wfptv"] Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.320712 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcherbc0e-account-delete-2zz5q"] Oct 03 09:08:15 crc kubenswrapper[4765]: E1003 09:08:15.321421 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-v6h2s], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/watcherbc0e-account-delete-2zz5q" podUID="6f162962-d9a4-4bfe-b2e2-0843ada74f39" Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.328456 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-bc0e-account-create-qh4zl"] Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.337394 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-bc0e-account-create-qh4zl"] Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.385197 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.385449 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerName="watcher-kuttl-api-log" 
containerID="cri-o://052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7" gracePeriod=30 Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.385532 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerName="watcher-api" containerID="cri-o://59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0" gracePeriod=30 Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.386210 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6h2s\" (UniqueName: \"kubernetes.io/projected/6f162962-d9a4-4bfe-b2e2-0843ada74f39-kube-api-access-v6h2s\") pod \"watcherbc0e-account-delete-2zz5q\" (UID: \"6f162962-d9a4-4bfe-b2e2-0843ada74f39\") " pod="watcher-kuttl-default/watcherbc0e-account-delete-2zz5q" Oct 03 09:08:15 crc kubenswrapper[4765]: E1003 09:08:15.386610 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:08:15 crc kubenswrapper[4765]: E1003 09:08:15.386708 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data podName:aa4c7fcd-bff5-453b-9bda-72e97366bc30 nodeName:}" failed. No retries permitted until 2025-10-03 09:08:15.886691173 +0000 UTC m=+1740.188185503 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "aa4c7fcd-bff5-453b-9bda-72e97366bc30") : secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:08:15 crc kubenswrapper[4765]: I1003 09:08:15.415498 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6h2s\" (UniqueName: \"kubernetes.io/projected/6f162962-d9a4-4bfe-b2e2-0843ada74f39-kube-api-access-v6h2s\") pod \"watcherbc0e-account-delete-2zz5q\" (UID: \"6f162962-d9a4-4bfe-b2e2-0843ada74f39\") " pod="watcher-kuttl-default/watcherbc0e-account-delete-2zz5q" Oct 03 09:08:15 crc kubenswrapper[4765]: E1003 09:08:15.901992 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:08:15 crc kubenswrapper[4765]: E1003 09:08:15.902058 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data podName:aa4c7fcd-bff5-453b-9bda-72e97366bc30 nodeName:}" failed. No retries permitted until 2025-10-03 09:08:16.902044194 +0000 UTC m=+1741.203538524 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "aa4c7fcd-bff5-453b-9bda-72e97366bc30") : secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.276794 4765 generic.go:334] "Generic (PLEG): container finished" podID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerID="052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7" exitCode=143 Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.277675 4765 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" secret="" err="secret \"watcher-watcher-kuttl-dockercfg-f4q6z\" not found" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.278258 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee","Type":"ContainerDied","Data":"052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7"} Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.278311 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherbc0e-account-delete-2zz5q" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.291481 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.193:9322/\": read tcp 10.217.0.2:33598->10.217.0.193:9322: read: connection reset by peer" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.291903 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.193:9322/\": read tcp 10.217.0.2:33596->10.217.0.193:9322: read: connection reset by peer" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.292055 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherbc0e-account-delete-2zz5q" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.314522 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:08:16 crc kubenswrapper[4765]: E1003 09:08:16.314862 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.321812 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c7eba2-2b60-4283-bdfe-320338b7c04d" path="/var/lib/kubelet/pods/42c7eba2-2b60-4283-bdfe-320338b7c04d/volumes" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.322431 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870c1bf5-02a9-472e-8747-b92bdc505367" path="/var/lib/kubelet/pods/870c1bf5-02a9-472e-8747-b92bdc505367/volumes" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.323146 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb0e5f30-95a2-4a98-80c2-cd33dd4328a1" path="/var/lib/kubelet/pods/bb0e5f30-95a2-4a98-80c2-cd33dd4328a1/volumes" Oct 03 09:08:16 crc kubenswrapper[4765]: E1003 09:08:16.384083 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce94645a3b1bf7876f235e3ac0ce6d987e69566b3de848e9be5e8eca67816971" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:08:16 crc kubenswrapper[4765]: E1003 09:08:16.387842 4765 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce94645a3b1bf7876f235e3ac0ce6d987e69566b3de848e9be5e8eca67816971" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:08:16 crc kubenswrapper[4765]: E1003 09:08:16.390373 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ce94645a3b1bf7876f235e3ac0ce6d987e69566b3de848e9be5e8eca67816971" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 03 09:08:16 crc kubenswrapper[4765]: E1003 09:08:16.390417 4765 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="c6c827c9-7f24-44dc-841b-246a3eafaae5" containerName="watcher-applier" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.409614 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6h2s\" (UniqueName: \"kubernetes.io/projected/6f162962-d9a4-4bfe-b2e2-0843ada74f39-kube-api-access-v6h2s\") pod \"6f162962-d9a4-4bfe-b2e2-0843ada74f39\" (UID: \"6f162962-d9a4-4bfe-b2e2-0843ada74f39\") " Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.416228 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f162962-d9a4-4bfe-b2e2-0843ada74f39-kube-api-access-v6h2s" (OuterVolumeSpecName: "kube-api-access-v6h2s") pod "6f162962-d9a4-4bfe-b2e2-0843ada74f39" (UID: "6f162962-d9a4-4bfe-b2e2-0843ada74f39"). InnerVolumeSpecName "kube-api-access-v6h2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.511933 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6h2s\" (UniqueName: \"kubernetes.io/projected/6f162962-d9a4-4bfe-b2e2-0843ada74f39-kube-api-access-v6h2s\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.727682 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.818832 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-combined-ca-bundle\") pod \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.818953 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-config-data\") pod \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.819030 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-cert-memcached-mtls\") pod \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.819181 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-logs\") pod \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.819223 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wjn6\" (UniqueName: \"kubernetes.io/projected/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-kube-api-access-7wjn6\") pod \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.819281 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-custom-prometheus-ca\") pod \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\" (UID: \"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee\") " Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.835921 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-logs" (OuterVolumeSpecName: "logs") pod "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" (UID: "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.849883 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-kube-api-access-7wjn6" (OuterVolumeSpecName: "kube-api-access-7wjn6") pod "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" (UID: "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee"). InnerVolumeSpecName "kube-api-access-7wjn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.917052 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" (UID: "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.921714 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.921760 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.921771 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wjn6\" (UniqueName: \"kubernetes.io/projected/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-kube-api-access-7wjn6\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:16 crc kubenswrapper[4765]: E1003 09:08:16.921853 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:08:16 crc kubenswrapper[4765]: E1003 09:08:16.921898 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data podName:aa4c7fcd-bff5-453b-9bda-72e97366bc30 nodeName:}" failed. No retries permitted until 2025-10-03 09:08:18.921884036 +0000 UTC m=+1743.223378366 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "aa4c7fcd-bff5-453b-9bda-72e97366bc30") : secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.932804 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" (UID: "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:16 crc kubenswrapper[4765]: I1003 09:08:16.996013 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-config-data" (OuterVolumeSpecName: "config-data") pod "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" (UID: "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.027710 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.027745 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.035602 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" (UID: "9ce33c7a-0177-422f-8cfa-7d88fa60c2ee"). 
InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.129160 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.290134 4765 generic.go:334] "Generic (PLEG): container finished" podID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerID="59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0" exitCode=0 Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.290243 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcherbc0e-account-delete-2zz5q" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.291773 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.291793 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee","Type":"ContainerDied","Data":"59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0"} Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.291855 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"9ce33c7a-0177-422f-8cfa-7d88fa60c2ee","Type":"ContainerDied","Data":"ef498f5ba8a9608f3f5c387cf09146a241fbed441284f31a1cae3965a60d6696"} Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.291879 4765 scope.go:117] "RemoveContainer" containerID="59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.292204 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="aa4c7fcd-bff5-453b-9bda-72e97366bc30" containerName="watcher-decision-engine" containerID="cri-o://17a492e825edc0d43037fcce9d5cefdb2f7464e8af080eb65a27a1198b235738" gracePeriod=30 Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.343433 4765 scope.go:117] "RemoveContainer" containerID="052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.369406 4765 scope.go:117] "RemoveContainer" containerID="59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0" Oct 03 09:08:17 crc kubenswrapper[4765]: E1003 09:08:17.370149 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0\": container with ID starting with 59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0 not found: ID does not exist" containerID="59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.370252 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0"} err="failed to get container status \"59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0\": rpc error: code = NotFound desc = could not find container \"59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0\": container with ID starting with 
59516f60cc564107349b6d1caf969e96f2d1f0ae07ece499ad03762a161bf1e0 not found: ID does not exist" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.370301 4765 scope.go:117] "RemoveContainer" containerID="052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7" Oct 03 09:08:17 crc kubenswrapper[4765]: E1003 09:08:17.370868 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7\": container with ID starting with 052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7 not found: ID does not exist" containerID="052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.370942 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7"} err="failed to get container status \"052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7\": rpc error: code = NotFound desc = could not find container \"052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7\": container with ID starting with 052bfd19b2f2aac71e6087c585e63529b2fe429d5adbb681db8feb4bac7908e7 not found: ID does not exist" Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.376100 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcherbc0e-account-delete-2zz5q"] Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.382824 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcherbc0e-account-delete-2zz5q"] Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.392629 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.405747 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.935630 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.935919 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="ceilometer-central-agent" containerID="cri-o://be70af77a826ff13cbe22bb18412cc2099feae7abb5f1222016c016d21b016c9" gracePeriod=30 Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.936008 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="proxy-httpd" containerID="cri-o://236c63359cd8872236a6a89cf2cb6e3f388f04a643f9dc8573f79c825364ae91" gracePeriod=30 Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.936043 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="ceilometer-notification-agent" containerID="cri-o://7e4d64087b1fec4ecb167e1456cba2d2daf4a56df98ea15c9be2879c2e8309a0" gracePeriod=30 Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.936032 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="sg-core" 
containerID="cri-o://a1225aa184b4b800149b86dc4f7e720809d9d0bd2564424d0ecb327cfc3e32cb" gracePeriod=30 Oct 03 09:08:17 crc kubenswrapper[4765]: I1003 09:08:17.945031 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.209:3000/\": EOF" Oct 03 09:08:18 crc kubenswrapper[4765]: I1003 09:08:18.301211 4765 generic.go:334] "Generic (PLEG): container finished" podID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerID="236c63359cd8872236a6a89cf2cb6e3f388f04a643f9dc8573f79c825364ae91" exitCode=0 Oct 03 09:08:18 crc kubenswrapper[4765]: I1003 09:08:18.301235 4765 generic.go:334] "Generic (PLEG): container finished" podID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerID="a1225aa184b4b800149b86dc4f7e720809d9d0bd2564424d0ecb327cfc3e32cb" exitCode=2 Oct 03 09:08:18 crc kubenswrapper[4765]: I1003 09:08:18.301253 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"78b9aabd-dac4-465e-93c6-0e6199890b40","Type":"ContainerDied","Data":"236c63359cd8872236a6a89cf2cb6e3f388f04a643f9dc8573f79c825364ae91"} Oct 03 09:08:18 crc kubenswrapper[4765]: I1003 09:08:18.301284 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"78b9aabd-dac4-465e-93c6-0e6199890b40","Type":"ContainerDied","Data":"a1225aa184b4b800149b86dc4f7e720809d9d0bd2564424d0ecb327cfc3e32cb"} Oct 03 09:08:18 crc kubenswrapper[4765]: I1003 09:08:18.315822 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f162962-d9a4-4bfe-b2e2-0843ada74f39" path="/var/lib/kubelet/pods/6f162962-d9a4-4bfe-b2e2-0843ada74f39/volumes" Oct 03 09:08:18 crc kubenswrapper[4765]: I1003 09:08:18.316285 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" path="/var/lib/kubelet/pods/9ce33c7a-0177-422f-8cfa-7d88fa60c2ee/volumes" Oct 03 09:08:18 crc kubenswrapper[4765]: I1003 09:08:18.665897 4765 scope.go:117] "RemoveContainer" containerID="24a927e1eb93abbbf3d89f6e61c8d60c35cb8444b79fef81158ad66d2e80d4ab" Oct 03 09:08:18 crc kubenswrapper[4765]: I1003 09:08:18.686410 4765 scope.go:117] "RemoveContainer" containerID="caa0c7221bdba9e370f980c5b399ceba968e1831eae98d7ae392be92df6494e5" Oct 03 09:08:18 crc kubenswrapper[4765]: I1003 09:08:18.743625 4765 scope.go:117] "RemoveContainer" containerID="420bd236d0b7be62bc45f519bc95d0a361de7a867c4336e89abf73594df0cc38" Oct 03 09:08:18 crc kubenswrapper[4765]: I1003 09:08:18.777764 4765 scope.go:117] "RemoveContainer" containerID="323551f52f86181592b93eb14fff39ac59c9ed25a4731bbfba3cc8d0f7b904fb" Oct 03 09:08:18 crc kubenswrapper[4765]: I1003 09:08:18.829460 4765 scope.go:117] "RemoveContainer" containerID="3e876d0881959f828070159610cf715b0328f5f69152bcbf95e36604b60306d6" Oct 03 09:08:18 crc kubenswrapper[4765]: E1003 09:08:18.961040 4765 secret.go:188] Couldn't get secret watcher-kuttl-default/watcher-kuttl-decision-engine-config-data: secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:08:18 crc kubenswrapper[4765]: E1003 09:08:18.961117 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data podName:aa4c7fcd-bff5-453b-9bda-72e97366bc30 nodeName:}" failed. No retries permitted until 2025-10-03 09:08:22.961100868 +0000 UTC m=+1747.262595188 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data") pod "watcher-kuttl-decision-engine-0" (UID: "aa4c7fcd-bff5-453b-9bda-72e97366bc30") : secret "watcher-kuttl-decision-engine-config-data" not found Oct 03 09:08:19 crc kubenswrapper[4765]: I1003 09:08:19.320235 4765 generic.go:334] "Generic (PLEG): container finished" podID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerID="be70af77a826ff13cbe22bb18412cc2099feae7abb5f1222016c016d21b016c9" exitCode=0 Oct 03 09:08:19 crc kubenswrapper[4765]: I1003 09:08:19.320317 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"78b9aabd-dac4-465e-93c6-0e6199890b40","Type":"ContainerDied","Data":"be70af77a826ff13cbe22bb18412cc2099feae7abb5f1222016c016d21b016c9"} Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.327950 4765 generic.go:334] "Generic (PLEG): container finished" podID="c6c827c9-7f24-44dc-841b-246a3eafaae5" containerID="ce94645a3b1bf7876f235e3ac0ce6d987e69566b3de848e9be5e8eca67816971" exitCode=0 Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.328420 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"c6c827c9-7f24-44dc-841b-246a3eafaae5","Type":"ContainerDied","Data":"ce94645a3b1bf7876f235e3ac0ce6d987e69566b3de848e9be5e8eca67816971"} Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.521197 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.594038 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6c827c9-7f24-44dc-841b-246a3eafaae5-logs\") pod \"c6c827c9-7f24-44dc-841b-246a3eafaae5\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.594207 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-config-data\") pod \"c6c827c9-7f24-44dc-841b-246a3eafaae5\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.594238 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-cert-memcached-mtls\") pod \"c6c827c9-7f24-44dc-841b-246a3eafaae5\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.594269 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl5pw\" (UniqueName: \"kubernetes.io/projected/c6c827c9-7f24-44dc-841b-246a3eafaae5-kube-api-access-hl5pw\") pod \"c6c827c9-7f24-44dc-841b-246a3eafaae5\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.594289 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-combined-ca-bundle\") pod \"c6c827c9-7f24-44dc-841b-246a3eafaae5\" (UID: \"c6c827c9-7f24-44dc-841b-246a3eafaae5\") " Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.594603 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c6c827c9-7f24-44dc-841b-246a3eafaae5-logs" (OuterVolumeSpecName: "logs") pod "c6c827c9-7f24-44dc-841b-246a3eafaae5" (UID: "c6c827c9-7f24-44dc-841b-246a3eafaae5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.603881 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c827c9-7f24-44dc-841b-246a3eafaae5-kube-api-access-hl5pw" (OuterVolumeSpecName: "kube-api-access-hl5pw") pod "c6c827c9-7f24-44dc-841b-246a3eafaae5" (UID: "c6c827c9-7f24-44dc-841b-246a3eafaae5"). InnerVolumeSpecName "kube-api-access-hl5pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.627855 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6c827c9-7f24-44dc-841b-246a3eafaae5" (UID: "c6c827c9-7f24-44dc-841b-246a3eafaae5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.653999 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-config-data" (OuterVolumeSpecName: "config-data") pod "c6c827c9-7f24-44dc-841b-246a3eafaae5" (UID: "c6c827c9-7f24-44dc-841b-246a3eafaae5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.670077 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "c6c827c9-7f24-44dc-841b-246a3eafaae5" (UID: "c6c827c9-7f24-44dc-841b-246a3eafaae5"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.697001 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.697377 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.697479 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl5pw\" (UniqueName: \"kubernetes.io/projected/c6c827c9-7f24-44dc-841b-246a3eafaae5-kube-api-access-hl5pw\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.697573 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c827c9-7f24-44dc-841b-246a3eafaae5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:20 crc kubenswrapper[4765]: I1003 09:08:20.697668 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6c827c9-7f24-44dc-841b-246a3eafaae5-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.359739 4765 generic.go:334] "Generic (PLEG): container finished" podID="aa4c7fcd-bff5-453b-9bda-72e97366bc30" containerID="17a492e825edc0d43037fcce9d5cefdb2f7464e8af080eb65a27a1198b235738" exitCode=0 Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.359818 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"aa4c7fcd-bff5-453b-9bda-72e97366bc30","Type":"ContainerDied","Data":"17a492e825edc0d43037fcce9d5cefdb2f7464e8af080eb65a27a1198b235738"} Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.364113 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"c6c827c9-7f24-44dc-841b-246a3eafaae5","Type":"ContainerDied","Data":"206cbd03b1dd601820f9b7ce00262169db25d13c3c7067eec441ffa7df44ef44"} Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.364178 4765 scope.go:117] "RemoveContainer" containerID="ce94645a3b1bf7876f235e3ac0ce6d987e69566b3de848e9be5e8eca67816971" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.364235 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.451505 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.470962 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.481086 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.515705 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-combined-ca-bundle\") pod \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.515748 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-custom-prometheus-ca\") pod \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.515773 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4c7fcd-bff5-453b-9bda-72e97366bc30-logs\") pod \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.515816 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5647s\" (UniqueName: \"kubernetes.io/projected/aa4c7fcd-bff5-453b-9bda-72e97366bc30-kube-api-access-5647s\") pod \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.515899 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-cert-memcached-mtls\") pod \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.515984 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data\") pod \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\" (UID: \"aa4c7fcd-bff5-453b-9bda-72e97366bc30\") " Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.516891 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa4c7fcd-bff5-453b-9bda-72e97366bc30-logs" (OuterVolumeSpecName: "logs") pod "aa4c7fcd-bff5-453b-9bda-72e97366bc30" (UID: "aa4c7fcd-bff5-453b-9bda-72e97366bc30"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.530128 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa4c7fcd-bff5-453b-9bda-72e97366bc30-kube-api-access-5647s" (OuterVolumeSpecName: "kube-api-access-5647s") pod "aa4c7fcd-bff5-453b-9bda-72e97366bc30" (UID: "aa4c7fcd-bff5-453b-9bda-72e97366bc30"). InnerVolumeSpecName "kube-api-access-5647s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.548405 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "aa4c7fcd-bff5-453b-9bda-72e97366bc30" (UID: "aa4c7fcd-bff5-453b-9bda-72e97366bc30"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.549434 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa4c7fcd-bff5-453b-9bda-72e97366bc30" (UID: "aa4c7fcd-bff5-453b-9bda-72e97366bc30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.571129 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data" (OuterVolumeSpecName: "config-data") pod "aa4c7fcd-bff5-453b-9bda-72e97366bc30" (UID: "aa4c7fcd-bff5-453b-9bda-72e97366bc30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.616807 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "aa4c7fcd-bff5-453b-9bda-72e97366bc30" (UID: "aa4c7fcd-bff5-453b-9bda-72e97366bc30"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.618275 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.618299 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.618309 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.618320 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aa4c7fcd-bff5-453b-9bda-72e97366bc30-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.618333 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4c7fcd-bff5-453b-9bda-72e97366bc30-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:21 crc kubenswrapper[4765]: I1003 09:08:21.618345 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5647s\" (UniqueName: \"kubernetes.io/projected/aa4c7fcd-bff5-453b-9bda-72e97366bc30-kube-api-access-5647s\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:22 crc kubenswrapper[4765]: I1003 09:08:22.316801 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6c827c9-7f24-44dc-841b-246a3eafaae5" path="/var/lib/kubelet/pods/c6c827c9-7f24-44dc-841b-246a3eafaae5/volumes" Oct 03 09:08:22 crc kubenswrapper[4765]: I1003 09:08:22.386590 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"aa4c7fcd-bff5-453b-9bda-72e97366bc30","Type":"ContainerDied","Data":"b17f8207e84939b59ca8c95917e139f6da88fa05e7c2beda2e740fdb8a7fe7f5"} Oct 03 09:08:22 crc kubenswrapper[4765]: I1003 09:08:22.386665 4765 scope.go:117] "RemoveContainer" containerID="17a492e825edc0d43037fcce9d5cefdb2f7464e8af080eb65a27a1198b235738" Oct 03 09:08:22 crc kubenswrapper[4765]: I1003 09:08:22.386715 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:22 crc kubenswrapper[4765]: I1003 09:08:22.409577 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:08:22 crc kubenswrapper[4765]: I1003 09:08:22.416041 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.404695 4765 generic.go:334] "Generic (PLEG): container finished" podID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerID="7e4d64087b1fec4ecb167e1456cba2d2daf4a56df98ea15c9be2879c2e8309a0" exitCode=0 Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.405425 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"78b9aabd-dac4-465e-93c6-0e6199890b40","Type":"ContainerDied","Data":"7e4d64087b1fec4ecb167e1456cba2d2daf4a56df98ea15c9be2879c2e8309a0"} Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.498824 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-swdf8"] Oct 03 09:08:23 crc kubenswrapper[4765]: E1003 09:08:23.499149 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerName="watcher-api" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.499175 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerName="watcher-api" Oct 03 09:08:23 crc kubenswrapper[4765]: E1003 09:08:23.499206 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4c7fcd-bff5-453b-9bda-72e97366bc30" containerName="watcher-decision-engine" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.499212 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4c7fcd-bff5-453b-9bda-72e97366bc30" containerName="watcher-decision-engine" Oct 03 09:08:23 crc kubenswrapper[4765]: E1003 09:08:23.499234 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c827c9-7f24-44dc-841b-246a3eafaae5" containerName="watcher-applier" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.499242 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c827c9-7f24-44dc-841b-246a3eafaae5" containerName="watcher-applier" Oct 03 09:08:23 crc kubenswrapper[4765]: E1003 09:08:23.499257 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerName="watcher-kuttl-api-log" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.499264 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerName="watcher-kuttl-api-log" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.499553 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerName="watcher-kuttl-api-log" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.499576 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c827c9-7f24-44dc-841b-246a3eafaae5" containerName="watcher-applier" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.499590 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa4c7fcd-bff5-453b-9bda-72e97366bc30" containerName="watcher-decision-engine" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.499600 4765 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9ce33c7a-0177-422f-8cfa-7d88fa60c2ee" containerName="watcher-api" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.500316 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-swdf8" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.511527 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-swdf8"] Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.550220 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnvgw\" (UniqueName: \"kubernetes.io/projected/e6d93bea-73df-46c9-8ac7-c68d9c6ab117-kube-api-access-nnvgw\") pod \"watcher-db-create-swdf8\" (UID: \"e6d93bea-73df-46c9-8ac7-c68d9c6ab117\") " pod="watcher-kuttl-default/watcher-db-create-swdf8" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.555084 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.652235 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-log-httpd\") pod \"78b9aabd-dac4-465e-93c6-0e6199890b40\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.652530 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-sg-core-conf-yaml\") pod \"78b9aabd-dac4-465e-93c6-0e6199890b40\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.652672 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz7cd\" (UniqueName: \"kubernetes.io/projected/78b9aabd-dac4-465e-93c6-0e6199890b40-kube-api-access-nz7cd\") pod \"78b9aabd-dac4-465e-93c6-0e6199890b40\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.652826 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-scripts\") pod \"78b9aabd-dac4-465e-93c6-0e6199890b40\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.652955 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-config-data\") pod \"78b9aabd-dac4-465e-93c6-0e6199890b40\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.653151 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-run-httpd\") pod \"78b9aabd-dac4-465e-93c6-0e6199890b40\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.653255 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-combined-ca-bundle\") pod \"78b9aabd-dac4-465e-93c6-0e6199890b40\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 
09:08:23.653365 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-ceilometer-tls-certs\") pod \"78b9aabd-dac4-465e-93c6-0e6199890b40\" (UID: \"78b9aabd-dac4-465e-93c6-0e6199890b40\") " Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.653700 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnvgw\" (UniqueName: \"kubernetes.io/projected/e6d93bea-73df-46c9-8ac7-c68d9c6ab117-kube-api-access-nnvgw\") pod \"watcher-db-create-swdf8\" (UID: \"e6d93bea-73df-46c9-8ac7-c68d9c6ab117\") " pod="watcher-kuttl-default/watcher-db-create-swdf8" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.653286 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "78b9aabd-dac4-465e-93c6-0e6199890b40" (UID: "78b9aabd-dac4-465e-93c6-0e6199890b40"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.654172 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "78b9aabd-dac4-465e-93c6-0e6199890b40" (UID: "78b9aabd-dac4-465e-93c6-0e6199890b40"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.659094 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-scripts" (OuterVolumeSpecName: "scripts") pod "78b9aabd-dac4-465e-93c6-0e6199890b40" (UID: "78b9aabd-dac4-465e-93c6-0e6199890b40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.660230 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b9aabd-dac4-465e-93c6-0e6199890b40-kube-api-access-nz7cd" (OuterVolumeSpecName: "kube-api-access-nz7cd") pod "78b9aabd-dac4-465e-93c6-0e6199890b40" (UID: "78b9aabd-dac4-465e-93c6-0e6199890b40"). InnerVolumeSpecName "kube-api-access-nz7cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.681837 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnvgw\" (UniqueName: \"kubernetes.io/projected/e6d93bea-73df-46c9-8ac7-c68d9c6ab117-kube-api-access-nnvgw\") pod \"watcher-db-create-swdf8\" (UID: \"e6d93bea-73df-46c9-8ac7-c68d9c6ab117\") " pod="watcher-kuttl-default/watcher-db-create-swdf8" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.693148 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "78b9aabd-dac4-465e-93c6-0e6199890b40" (UID: "78b9aabd-dac4-465e-93c6-0e6199890b40"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.699777 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "78b9aabd-dac4-465e-93c6-0e6199890b40" (UID: "78b9aabd-dac4-465e-93c6-0e6199890b40"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.755214 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.755477 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.755584 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz7cd\" (UniqueName: \"kubernetes.io/projected/78b9aabd-dac4-465e-93c6-0e6199890b40-kube-api-access-nz7cd\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.755734 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.756534 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78b9aabd-dac4-465e-93c6-0e6199890b40-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.756622 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.760910 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-config-data" (OuterVolumeSpecName: "config-data") pod "78b9aabd-dac4-465e-93c6-0e6199890b40" (UID: "78b9aabd-dac4-465e-93c6-0e6199890b40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.760974 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78b9aabd-dac4-465e-93c6-0e6199890b40" (UID: "78b9aabd-dac4-465e-93c6-0e6199890b40"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.858500 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.858534 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b9aabd-dac4-465e-93c6-0e6199890b40-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:23 crc kubenswrapper[4765]: I1003 09:08:23.875376 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-swdf8" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.318156 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa4c7fcd-bff5-453b-9bda-72e97366bc30" path="/var/lib/kubelet/pods/aa4c7fcd-bff5-453b-9bda-72e97366bc30/volumes" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.365413 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-swdf8"] Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.416000 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-swdf8" event={"ID":"e6d93bea-73df-46c9-8ac7-c68d9c6ab117","Type":"ContainerStarted","Data":"c9b6ed68410f19a71e52fdef781427b7165cf5a07dc4721ed1512a6af607ef82"} Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.420117 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"78b9aabd-dac4-465e-93c6-0e6199890b40","Type":"ContainerDied","Data":"a5349b9793c6371f457d2058c9076fe9eb4edc43a48e6514099e781b5a368f26"} Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.420162 4765 scope.go:117] "RemoveContainer" containerID="236c63359cd8872236a6a89cf2cb6e3f388f04a643f9dc8573f79c825364ae91" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.420268 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.463175 4765 scope.go:117] "RemoveContainer" containerID="a1225aa184b4b800149b86dc4f7e720809d9d0bd2564424d0ecb327cfc3e32cb" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.488713 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.513842 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.522973 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:08:24 crc kubenswrapper[4765]: E1003 09:08:24.523400 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="ceilometer-notification-agent" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.523416 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="ceilometer-notification-agent" Oct 03 09:08:24 crc kubenswrapper[4765]: E1003 09:08:24.523431 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="proxy-httpd" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.523436 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="proxy-httpd" Oct 03 09:08:24 crc kubenswrapper[4765]: E1003 09:08:24.523453 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="sg-core" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.523460 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="sg-core" Oct 03 09:08:24 crc kubenswrapper[4765]: E1003 09:08:24.523483 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="ceilometer-central-agent" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.523493 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="ceilometer-central-agent" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.523708 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="sg-core" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.523724 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="ceilometer-notification-agent" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.523743 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="proxy-httpd" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.523764 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" containerName="ceilometer-central-agent" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.526172 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.538194 4765 scope.go:117] "RemoveContainer" containerID="7e4d64087b1fec4ecb167e1456cba2d2daf4a56df98ea15c9be2879c2e8309a0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.579211 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.581491 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.589088 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.593638 4765 scope.go:117] "RemoveContainer" containerID="be70af77a826ff13cbe22bb18412cc2099feae7abb5f1222016c016d21b016c9" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.594857 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.674695 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.674827 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-scripts\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.674852 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-config-data\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.674942 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.674985 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.675012 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-run-httpd\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.675026 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-log-httpd\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.675062 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm8rf\" (UniqueName: \"kubernetes.io/projected/574ea430-f39b-420d-9b46-e64eb9e6135b-kube-api-access-qm8rf\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.776587 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-log-httpd\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.776637 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-run-httpd\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.776683 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm8rf\" (UniqueName: \"kubernetes.io/projected/574ea430-f39b-420d-9b46-e64eb9e6135b-kube-api-access-qm8rf\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.776728 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.776758 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-scripts\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.776776 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-config-data\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.776854 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.776889 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.777203 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-run-httpd\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.777254 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-log-httpd\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.783477 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.784322 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-config-data\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.784993 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.786087 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-scripts\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.797049 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.812497 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm8rf\" (UniqueName: \"kubernetes.io/projected/574ea430-f39b-420d-9b46-e64eb9e6135b-kube-api-access-qm8rf\") pod \"ceilometer-0\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:24 crc kubenswrapper[4765]: I1003 09:08:24.867160 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:25 crc kubenswrapper[4765]: I1003 09:08:25.410215 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:08:25 crc kubenswrapper[4765]: I1003 09:08:25.430180 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"574ea430-f39b-420d-9b46-e64eb9e6135b","Type":"ContainerStarted","Data":"28b94debead004499e7b60df67e2395baf2b84052124a7cc4caed7bde44afd98"} Oct 03 09:08:25 crc kubenswrapper[4765]: I1003 09:08:25.431886 4765 generic.go:334] "Generic (PLEG): container finished" podID="e6d93bea-73df-46c9-8ac7-c68d9c6ab117" containerID="58f9d0a6d37ff142730f3ba68ad738dc9ac8d9317f4f3c26c3a577dc34fdd2f9" exitCode=0 Oct 03 09:08:25 crc kubenswrapper[4765]: I1003 09:08:25.431938 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-swdf8" event={"ID":"e6d93bea-73df-46c9-8ac7-c68d9c6ab117","Type":"ContainerDied","Data":"58f9d0a6d37ff142730f3ba68ad738dc9ac8d9317f4f3c26c3a577dc34fdd2f9"} Oct 03 09:08:26 crc kubenswrapper[4765]: I1003 09:08:26.320297 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b9aabd-dac4-465e-93c6-0e6199890b40" path="/var/lib/kubelet/pods/78b9aabd-dac4-465e-93c6-0e6199890b40/volumes" Oct 03 09:08:26 crc kubenswrapper[4765]: I1003 09:08:26.442062 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"574ea430-f39b-420d-9b46-e64eb9e6135b","Type":"ContainerStarted","Data":"30d84e308541b3eebf1cf7d69f996a051fec0a846a685abb13b90a184fba99fa"} Oct 03 09:08:26 crc kubenswrapper[4765]: I1003 09:08:26.793381 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-swdf8" Oct 03 09:08:26 crc kubenswrapper[4765]: I1003 09:08:26.911386 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnvgw\" (UniqueName: \"kubernetes.io/projected/e6d93bea-73df-46c9-8ac7-c68d9c6ab117-kube-api-access-nnvgw\") pod \"e6d93bea-73df-46c9-8ac7-c68d9c6ab117\" (UID: \"e6d93bea-73df-46c9-8ac7-c68d9c6ab117\") " Oct 03 09:08:26 crc kubenswrapper[4765]: I1003 09:08:26.918028 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d93bea-73df-46c9-8ac7-c68d9c6ab117-kube-api-access-nnvgw" (OuterVolumeSpecName: "kube-api-access-nnvgw") pod "e6d93bea-73df-46c9-8ac7-c68d9c6ab117" (UID: "e6d93bea-73df-46c9-8ac7-c68d9c6ab117"). InnerVolumeSpecName "kube-api-access-nnvgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:27 crc kubenswrapper[4765]: I1003 09:08:27.013894 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnvgw\" (UniqueName: \"kubernetes.io/projected/e6d93bea-73df-46c9-8ac7-c68d9c6ab117-kube-api-access-nnvgw\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:27 crc kubenswrapper[4765]: I1003 09:08:27.452383 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"574ea430-f39b-420d-9b46-e64eb9e6135b","Type":"ContainerStarted","Data":"e52914795d9952cfff13da50e0ad72afe1127e580aa4ef423c467c14e24be966"} Oct 03 09:08:27 crc kubenswrapper[4765]: I1003 09:08:27.455793 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-swdf8" event={"ID":"e6d93bea-73df-46c9-8ac7-c68d9c6ab117","Type":"ContainerDied","Data":"c9b6ed68410f19a71e52fdef781427b7165cf5a07dc4721ed1512a6af607ef82"} Oct 03 09:08:27 crc kubenswrapper[4765]: I1003 09:08:27.455834 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b6ed68410f19a71e52fdef781427b7165cf5a07dc4721ed1512a6af607ef82" Oct 03 09:08:27 crc kubenswrapper[4765]: I1003 09:08:27.455838 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-swdf8" Oct 03 09:08:29 crc kubenswrapper[4765]: I1003 09:08:29.471992 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"574ea430-f39b-420d-9b46-e64eb9e6135b","Type":"ContainerStarted","Data":"2fa8e1b9334e9f8919f511494a08200a08cfaa72c334a5e9739f139399bc2abc"} Oct 03 09:08:30 crc kubenswrapper[4765]: I1003 09:08:30.311906 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:08:30 crc kubenswrapper[4765]: E1003 09:08:30.312326 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:08:30 crc kubenswrapper[4765]: I1003 09:08:30.483268 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"574ea430-f39b-420d-9b46-e64eb9e6135b","Type":"ContainerStarted","Data":"5b1ec43428ec5ea3aa19b60992674f93c92171bb9518b1fb3f051f442004101c"} Oct 03 09:08:30 crc kubenswrapper[4765]: I1003 09:08:30.483423 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:30 crc kubenswrapper[4765]: I1003 09:08:30.509422 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.158506198 podStartE2EDuration="6.50940071s" podCreationTimestamp="2025-10-03 09:08:24 +0000 UTC" firstStartedPulling="2025-10-03 09:08:25.413526224 +0000 UTC m=+1749.715020554" lastFinishedPulling="2025-10-03 09:08:29.764420736 +0000 UTC m=+1754.065915066" observedRunningTime="2025-10-03 09:08:30.5032362 +0000 UTC m=+1754.804730530" watchObservedRunningTime="2025-10-03 09:08:30.50940071 +0000 UTC m=+1754.810895040" Oct 03 09:08:33 crc kubenswrapper[4765]: I1003 09:08:33.614529 4765 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-32f6-account-create-mtr5l"] Oct 03 09:08:33 crc kubenswrapper[4765]: E1003 09:08:33.615188 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d93bea-73df-46c9-8ac7-c68d9c6ab117" containerName="mariadb-database-create" Oct 03 09:08:33 crc kubenswrapper[4765]: I1003 09:08:33.615202 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d93bea-73df-46c9-8ac7-c68d9c6ab117" containerName="mariadb-database-create" Oct 03 09:08:33 crc kubenswrapper[4765]: I1003 09:08:33.615361 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d93bea-73df-46c9-8ac7-c68d9c6ab117" containerName="mariadb-database-create" Oct 03 09:08:33 crc kubenswrapper[4765]: I1003 09:08:33.615958 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-32f6-account-create-mtr5l" Oct 03 09:08:33 crc kubenswrapper[4765]: I1003 09:08:33.626290 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-32f6-account-create-mtr5l"] Oct 03 09:08:33 crc kubenswrapper[4765]: I1003 09:08:33.629128 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Oct 03 09:08:33 crc kubenswrapper[4765]: I1003 09:08:33.741118 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2hxj\" (UniqueName: \"kubernetes.io/projected/9d8df962-8e48-477a-a6f4-0b2067c1ccdd-kube-api-access-j2hxj\") pod \"watcher-32f6-account-create-mtr5l\" (UID: \"9d8df962-8e48-477a-a6f4-0b2067c1ccdd\") " pod="watcher-kuttl-default/watcher-32f6-account-create-mtr5l" Oct 03 09:08:33 crc kubenswrapper[4765]: I1003 09:08:33.842422 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2hxj\" (UniqueName: \"kubernetes.io/projected/9d8df962-8e48-477a-a6f4-0b2067c1ccdd-kube-api-access-j2hxj\") pod \"watcher-32f6-account-create-mtr5l\" (UID: \"9d8df962-8e48-477a-a6f4-0b2067c1ccdd\") " pod="watcher-kuttl-default/watcher-32f6-account-create-mtr5l" Oct 03 09:08:33 crc kubenswrapper[4765]: I1003 09:08:33.875532 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2hxj\" (UniqueName: \"kubernetes.io/projected/9d8df962-8e48-477a-a6f4-0b2067c1ccdd-kube-api-access-j2hxj\") pod \"watcher-32f6-account-create-mtr5l\" (UID: \"9d8df962-8e48-477a-a6f4-0b2067c1ccdd\") " pod="watcher-kuttl-default/watcher-32f6-account-create-mtr5l" Oct 03 09:08:33 crc kubenswrapper[4765]: I1003 09:08:33.935348 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-32f6-account-create-mtr5l" Oct 03 09:08:34 crc kubenswrapper[4765]: I1003 09:08:34.406437 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-32f6-account-create-mtr5l"] Oct 03 09:08:34 crc kubenswrapper[4765]: I1003 09:08:34.524015 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-32f6-account-create-mtr5l" event={"ID":"9d8df962-8e48-477a-a6f4-0b2067c1ccdd","Type":"ContainerStarted","Data":"75993caa0ef72cebcb39c53a863d6329ed0e357c86715ab765961a4d25463393"} Oct 03 09:08:35 crc kubenswrapper[4765]: I1003 09:08:35.535212 4765 generic.go:334] "Generic (PLEG): container finished" podID="9d8df962-8e48-477a-a6f4-0b2067c1ccdd" containerID="82095e3c8a766ab9578d34df5a12aed1393ea5c90dcc36c1b3d5b6998b2cf1c6" exitCode=0 Oct 03 09:08:35 crc kubenswrapper[4765]: I1003 09:08:35.535331 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-32f6-account-create-mtr5l" event={"ID":"9d8df962-8e48-477a-a6f4-0b2067c1ccdd","Type":"ContainerDied","Data":"82095e3c8a766ab9578d34df5a12aed1393ea5c90dcc36c1b3d5b6998b2cf1c6"} Oct 03 09:08:37 crc kubenswrapper[4765]: I1003 09:08:37.017386 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-32f6-account-create-mtr5l" Oct 03 09:08:37 crc kubenswrapper[4765]: I1003 09:08:37.114422 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2hxj\" (UniqueName: \"kubernetes.io/projected/9d8df962-8e48-477a-a6f4-0b2067c1ccdd-kube-api-access-j2hxj\") pod \"9d8df962-8e48-477a-a6f4-0b2067c1ccdd\" (UID: \"9d8df962-8e48-477a-a6f4-0b2067c1ccdd\") " Oct 03 09:08:37 crc kubenswrapper[4765]: I1003 09:08:37.120089 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d8df962-8e48-477a-a6f4-0b2067c1ccdd-kube-api-access-j2hxj" (OuterVolumeSpecName: "kube-api-access-j2hxj") pod "9d8df962-8e48-477a-a6f4-0b2067c1ccdd" (UID: "9d8df962-8e48-477a-a6f4-0b2067c1ccdd"). InnerVolumeSpecName "kube-api-access-j2hxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:37 crc kubenswrapper[4765]: I1003 09:08:37.215892 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2hxj\" (UniqueName: \"kubernetes.io/projected/9d8df962-8e48-477a-a6f4-0b2067c1ccdd-kube-api-access-j2hxj\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:37 crc kubenswrapper[4765]: I1003 09:08:37.554919 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-32f6-account-create-mtr5l" Oct 03 09:08:37 crc kubenswrapper[4765]: I1003 09:08:37.554825 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-32f6-account-create-mtr5l" event={"ID":"9d8df962-8e48-477a-a6f4-0b2067c1ccdd","Type":"ContainerDied","Data":"75993caa0ef72cebcb39c53a863d6329ed0e357c86715ab765961a4d25463393"} Oct 03 09:08:37 crc kubenswrapper[4765]: I1003 09:08:37.564891 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75993caa0ef72cebcb39c53a863d6329ed0e357c86715ab765961a4d25463393" Oct 03 09:08:38 crc kubenswrapper[4765]: I1003 09:08:38.787074 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj"] Oct 03 09:08:38 crc kubenswrapper[4765]: E1003 09:08:38.787441 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d8df962-8e48-477a-a6f4-0b2067c1ccdd" containerName="mariadb-account-create" Oct 03 09:08:38 crc kubenswrapper[4765]: I1003 09:08:38.787453 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8df962-8e48-477a-a6f4-0b2067c1ccdd" containerName="mariadb-account-create" Oct 03 09:08:38 crc kubenswrapper[4765]: I1003 09:08:38.787832 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d8df962-8e48-477a-a6f4-0b2067c1ccdd" containerName="mariadb-account-create" Oct 03 09:08:38 crc kubenswrapper[4765]: I1003 09:08:38.788405 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:38 crc kubenswrapper[4765]: I1003 09:08:38.790480 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-rmgxn" Oct 03 09:08:38 crc kubenswrapper[4765]: I1003 09:08:38.795846 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Oct 03 09:08:38 crc kubenswrapper[4765]: I1003 09:08:38.800573 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj"] Oct 03 09:08:38 crc kubenswrapper[4765]: I1003 09:08:38.941702 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:38 crc kubenswrapper[4765]: I1003 09:08:38.941744 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn245\" (UniqueName: \"kubernetes.io/projected/15257523-ed0d-43fe-aa54-96b150144372-kube-api-access-wn245\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:38 crc kubenswrapper[4765]: I1003 09:08:38.941834 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-db-sync-config-data\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:38 crc kubenswrapper[4765]: I1003 09:08:38.941855 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-config-data\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:39 crc kubenswrapper[4765]: I1003 09:08:39.042972 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:39 crc kubenswrapper[4765]: I1003 09:08:39.043275 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn245\" (UniqueName: \"kubernetes.io/projected/15257523-ed0d-43fe-aa54-96b150144372-kube-api-access-wn245\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:39 crc kubenswrapper[4765]: I1003 09:08:39.043343 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-db-sync-config-data\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:39 crc kubenswrapper[4765]: I1003 09:08:39.043363 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-config-data\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:39 crc kubenswrapper[4765]: I1003 09:08:39.049226 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-db-sync-config-data\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:39 crc kubenswrapper[4765]: I1003 09:08:39.049919 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-config-data\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:39 crc kubenswrapper[4765]: I1003 09:08:39.060445 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:39 crc kubenswrapper[4765]: I1003 09:08:39.061778 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn245\" (UniqueName: \"kubernetes.io/projected/15257523-ed0d-43fe-aa54-96b150144372-kube-api-access-wn245\") pod \"watcher-kuttl-db-sync-8j8kj\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:39 crc kubenswrapper[4765]: 
I1003 09:08:39.104724 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:39 crc kubenswrapper[4765]: I1003 09:08:39.583911 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj"] Oct 03 09:08:39 crc kubenswrapper[4765]: W1003 09:08:39.593446 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15257523_ed0d_43fe_aa54_96b150144372.slice/crio-eb26710b530201a7199b801c0fe1808ad895922013054ed4f71a90ffdbc017e4 WatchSource:0}: Error finding container eb26710b530201a7199b801c0fe1808ad895922013054ed4f71a90ffdbc017e4: Status 404 returned error can't find the container with id eb26710b530201a7199b801c0fe1808ad895922013054ed4f71a90ffdbc017e4 Oct 03 09:08:40 crc kubenswrapper[4765]: I1003 09:08:40.580757 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" event={"ID":"15257523-ed0d-43fe-aa54-96b150144372","Type":"ContainerStarted","Data":"8b36e642d06489402d75d70d3196a61e9a07ee67a6b1a2cbdce5f38e7afa9daa"} Oct 03 09:08:40 crc kubenswrapper[4765]: I1003 09:08:40.582268 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" event={"ID":"15257523-ed0d-43fe-aa54-96b150144372","Type":"ContainerStarted","Data":"eb26710b530201a7199b801c0fe1808ad895922013054ed4f71a90ffdbc017e4"} Oct 03 09:08:40 crc kubenswrapper[4765]: I1003 09:08:40.609582 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" podStartSLOduration=2.609565486 podStartE2EDuration="2.609565486s" podCreationTimestamp="2025-10-03 09:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:08:40.604860104 +0000 UTC m=+1764.906354434" watchObservedRunningTime="2025-10-03 09:08:40.609565486 +0000 UTC m=+1764.911059816" Oct 03 09:08:43 crc kubenswrapper[4765]: I1003 09:08:43.606341 4765 generic.go:334] "Generic (PLEG): container finished" podID="15257523-ed0d-43fe-aa54-96b150144372" containerID="8b36e642d06489402d75d70d3196a61e9a07ee67a6b1a2cbdce5f38e7afa9daa" exitCode=0 Oct 03 09:08:43 crc kubenswrapper[4765]: I1003 09:08:43.606924 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" event={"ID":"15257523-ed0d-43fe-aa54-96b150144372","Type":"ContainerDied","Data":"8b36e642d06489402d75d70d3196a61e9a07ee67a6b1a2cbdce5f38e7afa9daa"} Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.114664 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.244269 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-combined-ca-bundle\") pod \"15257523-ed0d-43fe-aa54-96b150144372\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.244326 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-db-sync-config-data\") pod \"15257523-ed0d-43fe-aa54-96b150144372\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.244384 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn245\" (UniqueName: \"kubernetes.io/projected/15257523-ed0d-43fe-aa54-96b150144372-kube-api-access-wn245\") pod \"15257523-ed0d-43fe-aa54-96b150144372\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.244498 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-config-data\") pod \"15257523-ed0d-43fe-aa54-96b150144372\" (UID: \"15257523-ed0d-43fe-aa54-96b150144372\") " Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.250273 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15257523-ed0d-43fe-aa54-96b150144372-kube-api-access-wn245" (OuterVolumeSpecName: "kube-api-access-wn245") pod "15257523-ed0d-43fe-aa54-96b150144372" (UID: "15257523-ed0d-43fe-aa54-96b150144372"). InnerVolumeSpecName "kube-api-access-wn245". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.252835 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "15257523-ed0d-43fe-aa54-96b150144372" (UID: "15257523-ed0d-43fe-aa54-96b150144372"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.271152 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15257523-ed0d-43fe-aa54-96b150144372" (UID: "15257523-ed0d-43fe-aa54-96b150144372"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.286521 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-config-data" (OuterVolumeSpecName: "config-data") pod "15257523-ed0d-43fe-aa54-96b150144372" (UID: "15257523-ed0d-43fe-aa54-96b150144372"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.307200 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:08:45 crc kubenswrapper[4765]: E1003 09:08:45.307469 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.346948 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.346992 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.347013 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15257523-ed0d-43fe-aa54-96b150144372-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.347026 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn245\" (UniqueName: \"kubernetes.io/projected/15257523-ed0d-43fe-aa54-96b150144372-kube-api-access-wn245\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.624821 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" event={"ID":"15257523-ed0d-43fe-aa54-96b150144372","Type":"ContainerDied","Data":"eb26710b530201a7199b801c0fe1808ad895922013054ed4f71a90ffdbc017e4"} Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.624862 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.624868 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb26710b530201a7199b801c0fe1808ad895922013054ed4f71a90ffdbc017e4" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.920587 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:08:45 crc kubenswrapper[4765]: E1003 09:08:45.921062 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15257523-ed0d-43fe-aa54-96b150144372" containerName="watcher-kuttl-db-sync" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.921091 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="15257523-ed0d-43fe-aa54-96b150144372" containerName="watcher-kuttl-db-sync" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.921277 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="15257523-ed0d-43fe-aa54-96b150144372" containerName="watcher-kuttl-db-sync" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.922197 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.924729 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.924775 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-rmgxn" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.943981 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.951504 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.952664 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.959373 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Oct 03 09:08:45 crc kubenswrapper[4765]: I1003 09:08:45.963481 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.020191 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.025460 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.029705 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.030965 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.073961 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.074287 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.074318 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqf52\" (UniqueName: \"kubernetes.io/projected/6a075387-26c0-423a-b081-403cf3aec3b8-kube-api-access-pqf52\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.074334 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e41242f3-47ba-4f24-8287-756b9a7743be-logs\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.074354 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhmj\" (UniqueName: \"kubernetes.io/projected/e41242f3-47ba-4f24-8287-756b9a7743be-kube-api-access-hnhmj\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.074371 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.074422 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.074440 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.074465 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.074490 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a075387-26c0-423a-b081-403cf3aec3b8-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.074538 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176126 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176552 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176591 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176618 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176795 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176875 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d4477a-79e0-4441-b6d1-789055a87840-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176903 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz9c7\" (UniqueName: \"kubernetes.io/projected/88d4477a-79e0-4441-b6d1-789055a87840-kube-api-access-vz9c7\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176919 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176941 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176965 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqf52\" (UniqueName: \"kubernetes.io/projected/6a075387-26c0-423a-b081-403cf3aec3b8-kube-api-access-pqf52\") pod \"watcher-kuttl-applier-0\" (UID: 
\"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176978 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41242f3-47ba-4f24-8287-756b9a7743be-logs\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.176996 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhmj\" (UniqueName: \"kubernetes.io/projected/e41242f3-47ba-4f24-8287-756b9a7743be-kube-api-access-hnhmj\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.177017 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.177047 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.177072 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.177102 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.177123 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a075387-26c0-423a-b081-403cf3aec3b8-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.177543 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a075387-26c0-423a-b081-403cf3aec3b8-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.177893 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41242f3-47ba-4f24-8287-756b9a7743be-logs\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.181441 
4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.181795 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.182352 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.189067 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.195300 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.196254 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqf52\" (UniqueName: \"kubernetes.io/projected/6a075387-26c0-423a-b081-403cf3aec3b8-kube-api-access-pqf52\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.197317 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.197811 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.197939 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhmj\" (UniqueName: \"kubernetes.io/projected/e41242f3-47ba-4f24-8287-756b9a7743be-kube-api-access-hnhmj\") pod \"watcher-kuttl-api-0\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.239743 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.271419 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.283355 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.283797 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.283960 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.283997 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.284220 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d4477a-79e0-4441-b6d1-789055a87840-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.284305 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz9c7\" (UniqueName: \"kubernetes.io/projected/88d4477a-79e0-4441-b6d1-789055a87840-kube-api-access-vz9c7\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.284374 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.284784 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d4477a-79e0-4441-b6d1-789055a87840-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.288849 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.298417 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.298450 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.304743 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz9c7\" (UniqueName: \"kubernetes.io/projected/88d4477a-79e0-4441-b6d1-789055a87840-kube-api-access-vz9c7\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.342082 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:46 crc kubenswrapper[4765]: W1003 09:08:46.790449 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode41242f3_47ba_4f24_8287_756b9a7743be.slice/crio-7b89fa6adccbfdca34f165870625d53aad58669f7962998bbfe754fb7bc9b141 WatchSource:0}: Error finding container 7b89fa6adccbfdca34f165870625d53aad58669f7962998bbfe754fb7bc9b141: Status 404 returned error can't find the container with id 7b89fa6adccbfdca34f165870625d53aad58669f7962998bbfe754fb7bc9b141 Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.800477 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.877626 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:08:46 crc kubenswrapper[4765]: I1003 09:08:46.995491 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:08:47 crc kubenswrapper[4765]: I1003 09:08:47.648489 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"6a075387-26c0-423a-b081-403cf3aec3b8","Type":"ContainerStarted","Data":"1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f"} Oct 03 09:08:47 crc kubenswrapper[4765]: I1003 09:08:47.648876 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"6a075387-26c0-423a-b081-403cf3aec3b8","Type":"ContainerStarted","Data":"b34b1cd005c60b7c3cd429297e2f55cb53ce0cf74b04166d19203351ff3e3162"} Oct 03 09:08:47 crc kubenswrapper[4765]: I1003 09:08:47.651133 4765 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"88d4477a-79e0-4441-b6d1-789055a87840","Type":"ContainerStarted","Data":"94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a"} Oct 03 09:08:47 crc kubenswrapper[4765]: I1003 09:08:47.651175 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"88d4477a-79e0-4441-b6d1-789055a87840","Type":"ContainerStarted","Data":"e4c5e7cef3b36b71391f3e13f32c93700edab7914120a74ef62c2f8b12c94d98"} Oct 03 09:08:47 crc kubenswrapper[4765]: I1003 09:08:47.654354 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e41242f3-47ba-4f24-8287-756b9a7743be","Type":"ContainerStarted","Data":"21ba0c2fb05ca7740fcfdaa71de126d194d83881c0201333dae0d087e258d33e"} Oct 03 09:08:47 crc kubenswrapper[4765]: I1003 09:08:47.654395 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e41242f3-47ba-4f24-8287-756b9a7743be","Type":"ContainerStarted","Data":"d9131e283bbc924a306a6a7e2ec75f01f9e5213922f31101da67362ef4e9fc14"} Oct 03 09:08:47 crc kubenswrapper[4765]: I1003 09:08:47.654409 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e41242f3-47ba-4f24-8287-756b9a7743be","Type":"ContainerStarted","Data":"7b89fa6adccbfdca34f165870625d53aad58669f7962998bbfe754fb7bc9b141"} Oct 03 09:08:47 crc kubenswrapper[4765]: I1003 09:08:47.655433 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:47 crc kubenswrapper[4765]: I1003 09:08:47.703561 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.703542572 podStartE2EDuration="2.703542572s" podCreationTimestamp="2025-10-03 09:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:08:47.677941025 +0000 UTC m=+1771.979435365" watchObservedRunningTime="2025-10-03 09:08:47.703542572 +0000 UTC m=+1772.005036902" Oct 03 09:08:47 crc kubenswrapper[4765]: I1003 09:08:47.734501 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.734482248 podStartE2EDuration="2.734482248s" podCreationTimestamp="2025-10-03 09:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:08:47.731632664 +0000 UTC m=+1772.033126994" watchObservedRunningTime="2025-10-03 09:08:47.734482248 +0000 UTC m=+1772.035976578" Oct 03 09:08:47 crc kubenswrapper[4765]: I1003 09:08:47.738127 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.738114953 podStartE2EDuration="2.738114953s" podCreationTimestamp="2025-10-03 09:08:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:08:47.705429611 +0000 UTC m=+1772.006923941" watchObservedRunningTime="2025-10-03 09:08:47.738114953 +0000 UTC m=+1772.039609283" Oct 03 09:08:49 crc kubenswrapper[4765]: I1003 09:08:49.672101 4765 prober_manager.go:312] "Failed to trigger 
a manual run" probe="Readiness" Oct 03 09:08:50 crc kubenswrapper[4765]: I1003 09:08:50.483185 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:51 crc kubenswrapper[4765]: I1003 09:08:51.240496 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:51 crc kubenswrapper[4765]: I1003 09:08:51.272519 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:54 crc kubenswrapper[4765]: I1003 09:08:54.875611 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:08:56 crc kubenswrapper[4765]: I1003 09:08:56.240096 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:56 crc kubenswrapper[4765]: I1003 09:08:56.260021 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:56 crc kubenswrapper[4765]: I1003 09:08:56.271850 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:56 crc kubenswrapper[4765]: I1003 09:08:56.300900 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:56 crc kubenswrapper[4765]: I1003 09:08:56.346192 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:56 crc kubenswrapper[4765]: I1003 09:08:56.370192 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:56 crc kubenswrapper[4765]: I1003 09:08:56.731954 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:56 crc kubenswrapper[4765]: I1003 09:08:56.742581 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:08:56 crc kubenswrapper[4765]: I1003 09:08:56.766344 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:08:56 crc kubenswrapper[4765]: I1003 09:08:56.841428 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:08:58 crc kubenswrapper[4765]: I1003 09:08:58.635301 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:08:58 crc kubenswrapper[4765]: I1003 09:08:58.636872 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="ceilometer-central-agent" containerID="cri-o://30d84e308541b3eebf1cf7d69f996a051fec0a846a685abb13b90a184fba99fa" gracePeriod=30 Oct 03 09:08:58 crc kubenswrapper[4765]: I1003 09:08:58.636996 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="ceilometer-notification-agent" 
containerID="cri-o://e52914795d9952cfff13da50e0ad72afe1127e580aa4ef423c467c14e24be966" gracePeriod=30 Oct 03 09:08:58 crc kubenswrapper[4765]: I1003 09:08:58.636996 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="sg-core" containerID="cri-o://2fa8e1b9334e9f8919f511494a08200a08cfaa72c334a5e9739f139399bc2abc" gracePeriod=30 Oct 03 09:08:58 crc kubenswrapper[4765]: I1003 09:08:58.636935 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="proxy-httpd" containerID="cri-o://5b1ec43428ec5ea3aa19b60992674f93c92171bb9518b1fb3f051f442004101c" gracePeriod=30 Oct 03 09:08:59 crc kubenswrapper[4765]: I1003 09:08:59.307203 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:08:59 crc kubenswrapper[4765]: E1003 09:08:59.307430 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:08:59 crc kubenswrapper[4765]: I1003 09:08:59.756017 4765 generic.go:334] "Generic (PLEG): container finished" podID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerID="5b1ec43428ec5ea3aa19b60992674f93c92171bb9518b1fb3f051f442004101c" exitCode=0 Oct 03 09:08:59 crc kubenswrapper[4765]: I1003 09:08:59.756384 4765 generic.go:334] "Generic (PLEG): container finished" podID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerID="2fa8e1b9334e9f8919f511494a08200a08cfaa72c334a5e9739f139399bc2abc" exitCode=2 Oct 03 09:08:59 crc kubenswrapper[4765]: I1003 09:08:59.756398 4765 generic.go:334] "Generic (PLEG): container finished" podID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerID="30d84e308541b3eebf1cf7d69f996a051fec0a846a685abb13b90a184fba99fa" exitCode=0 Oct 03 09:08:59 crc kubenswrapper[4765]: I1003 09:08:59.756081 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"574ea430-f39b-420d-9b46-e64eb9e6135b","Type":"ContainerDied","Data":"5b1ec43428ec5ea3aa19b60992674f93c92171bb9518b1fb3f051f442004101c"} Oct 03 09:08:59 crc kubenswrapper[4765]: I1003 09:08:59.756445 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"574ea430-f39b-420d-9b46-e64eb9e6135b","Type":"ContainerDied","Data":"2fa8e1b9334e9f8919f511494a08200a08cfaa72c334a5e9739f139399bc2abc"} Oct 03 09:08:59 crc kubenswrapper[4765]: I1003 09:08:59.756465 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"574ea430-f39b-420d-9b46-e64eb9e6135b","Type":"ContainerDied","Data":"30d84e308541b3eebf1cf7d69f996a051fec0a846a685abb13b90a184fba99fa"} Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.792869 4765 generic.go:334] "Generic (PLEG): container finished" podID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerID="e52914795d9952cfff13da50e0ad72afe1127e580aa4ef423c467c14e24be966" exitCode=0 Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.792947 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="watcher-kuttl-default/ceilometer-0" event={"ID":"574ea430-f39b-420d-9b46-e64eb9e6135b","Type":"ContainerDied","Data":"e52914795d9952cfff13da50e0ad72afe1127e580aa4ef423c467c14e24be966"} Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.861469 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.990881 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-combined-ca-bundle\") pod \"574ea430-f39b-420d-9b46-e64eb9e6135b\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.990943 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-sg-core-conf-yaml\") pod \"574ea430-f39b-420d-9b46-e64eb9e6135b\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.991013 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-run-httpd\") pod \"574ea430-f39b-420d-9b46-e64eb9e6135b\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.991078 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm8rf\" (UniqueName: \"kubernetes.io/projected/574ea430-f39b-420d-9b46-e64eb9e6135b-kube-api-access-qm8rf\") pod \"574ea430-f39b-420d-9b46-e64eb9e6135b\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.991106 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-ceilometer-tls-certs\") pod \"574ea430-f39b-420d-9b46-e64eb9e6135b\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.991129 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-log-httpd\") pod \"574ea430-f39b-420d-9b46-e64eb9e6135b\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.991216 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-scripts\") pod \"574ea430-f39b-420d-9b46-e64eb9e6135b\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.991332 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-config-data\") pod \"574ea430-f39b-420d-9b46-e64eb9e6135b\" (UID: \"574ea430-f39b-420d-9b46-e64eb9e6135b\") " Oct 03 09:09:03 crc kubenswrapper[4765]: I1003 09:09:03.991680 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "574ea430-f39b-420d-9b46-e64eb9e6135b" (UID: "574ea430-f39b-420d-9b46-e64eb9e6135b"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.011633 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "574ea430-f39b-420d-9b46-e64eb9e6135b" (UID: "574ea430-f39b-420d-9b46-e64eb9e6135b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.011878 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-scripts" (OuterVolumeSpecName: "scripts") pod "574ea430-f39b-420d-9b46-e64eb9e6135b" (UID: "574ea430-f39b-420d-9b46-e64eb9e6135b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.012925 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/574ea430-f39b-420d-9b46-e64eb9e6135b-kube-api-access-qm8rf" (OuterVolumeSpecName: "kube-api-access-qm8rf") pod "574ea430-f39b-420d-9b46-e64eb9e6135b" (UID: "574ea430-f39b-420d-9b46-e64eb9e6135b"). InnerVolumeSpecName "kube-api-access-qm8rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.053850 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "574ea430-f39b-420d-9b46-e64eb9e6135b" (UID: "574ea430-f39b-420d-9b46-e64eb9e6135b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.079082 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "574ea430-f39b-420d-9b46-e64eb9e6135b" (UID: "574ea430-f39b-420d-9b46-e64eb9e6135b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.093145 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.093179 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.093190 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.093199 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm8rf\" (UniqueName: \"kubernetes.io/projected/574ea430-f39b-420d-9b46-e64eb9e6135b-kube-api-access-qm8rf\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.093209 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.093218 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/574ea430-f39b-420d-9b46-e64eb9e6135b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.093719 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "574ea430-f39b-420d-9b46-e64eb9e6135b" (UID: "574ea430-f39b-420d-9b46-e64eb9e6135b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.111998 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-config-data" (OuterVolumeSpecName: "config-data") pod "574ea430-f39b-420d-9b46-e64eb9e6135b" (UID: "574ea430-f39b-420d-9b46-e64eb9e6135b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.194375 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.194628 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/574ea430-f39b-420d-9b46-e64eb9e6135b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.802364 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"574ea430-f39b-420d-9b46-e64eb9e6135b","Type":"ContainerDied","Data":"28b94debead004499e7b60df67e2395baf2b84052124a7cc4caed7bde44afd98"} Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.802417 4765 scope.go:117] "RemoveContainer" containerID="5b1ec43428ec5ea3aa19b60992674f93c92171bb9518b1fb3f051f442004101c" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.802447 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.826087 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.826903 4765 scope.go:117] "RemoveContainer" containerID="2fa8e1b9334e9f8919f511494a08200a08cfaa72c334a5e9739f139399bc2abc" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.837991 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.853225 4765 scope.go:117] "RemoveContainer" containerID="e52914795d9952cfff13da50e0ad72afe1127e580aa4ef423c467c14e24be966" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.860465 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:04 crc kubenswrapper[4765]: E1003 09:09:04.860880 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="sg-core" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.860898 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="sg-core" Oct 03 09:09:04 crc kubenswrapper[4765]: E1003 09:09:04.860912 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="ceilometer-central-agent" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.860926 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="ceilometer-central-agent" Oct 03 09:09:04 crc kubenswrapper[4765]: E1003 09:09:04.860950 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="ceilometer-notification-agent" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.860958 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="ceilometer-notification-agent" Oct 03 09:09:04 crc kubenswrapper[4765]: E1003 09:09:04.860986 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="proxy-httpd" Oct 03 09:09:04 crc 
kubenswrapper[4765]: I1003 09:09:04.860993 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="proxy-httpd" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.861164 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="sg-core" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.861181 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="ceilometer-central-agent" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.861200 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="ceilometer-notification-agent" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.861235 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" containerName="proxy-httpd" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.862684 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.865479 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.870798 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.873326 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.877782 4765 scope.go:117] "RemoveContainer" containerID="30d84e308541b3eebf1cf7d69f996a051fec0a846a685abb13b90a184fba99fa" Oct 03 09:09:04 crc kubenswrapper[4765]: I1003 09:09:04.883123 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.007331 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.007385 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.007444 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt4dt\" (UniqueName: \"kubernetes.io/projected/8a42feb3-6ec9-44be-b975-483de1699d32-kube-api-access-dt4dt\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.007491 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-run-httpd\") pod \"ceilometer-0\" (UID: 
\"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.007525 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-log-httpd\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.007552 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.007595 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-scripts\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.007616 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-config-data\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.109121 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-config-data\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.109183 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.109209 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.109253 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt4dt\" (UniqueName: \"kubernetes.io/projected/8a42feb3-6ec9-44be-b975-483de1699d32-kube-api-access-dt4dt\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.109290 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-run-httpd\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.109319 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-log-httpd\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.109340 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.109373 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-scripts\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.110200 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-log-httpd\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.110309 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-run-httpd\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.115390 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-scripts\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.115816 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.115869 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-config-data\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.116201 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.116288 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.130536 
4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt4dt\" (UniqueName: \"kubernetes.io/projected/8a42feb3-6ec9-44be-b975-483de1699d32-kube-api-access-dt4dt\") pod \"ceilometer-0\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.181176 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.669268 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:05 crc kubenswrapper[4765]: I1003 09:09:05.811096 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a42feb3-6ec9-44be-b975-483de1699d32","Type":"ContainerStarted","Data":"004e84bec52ff853beae9062128392d9322c516ad141f706f783e01131d9397a"} Oct 03 09:09:06 crc kubenswrapper[4765]: I1003 09:09:06.317253 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="574ea430-f39b-420d-9b46-e64eb9e6135b" path="/var/lib/kubelet/pods/574ea430-f39b-420d-9b46-e64eb9e6135b/volumes" Oct 03 09:09:06 crc kubenswrapper[4765]: I1003 09:09:06.820703 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a42feb3-6ec9-44be-b975-483de1699d32","Type":"ContainerStarted","Data":"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76"} Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.043707 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.049101 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-8j8kj"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.100440 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.100724 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="6a075387-26c0-423a-b081-403cf3aec3b8" containerName="watcher-applier" containerID="cri-o://1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f" gracePeriod=30 Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.130753 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher32f6-account-delete-2bphl"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.132029 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher32f6-account-delete-2bphl" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.145392 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher32f6-account-delete-2bphl"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.150784 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.151023 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="88d4477a-79e0-4441-b6d1-789055a87840" containerName="watcher-decision-engine" containerID="cri-o://94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a" gracePeriod=30 Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.241821 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz94x\" (UniqueName: \"kubernetes.io/projected/0174d3df-5a54-49bd-bbc4-bf824c7a865a-kube-api-access-zz94x\") pod \"watcher32f6-account-delete-2bphl\" (UID: \"0174d3df-5a54-49bd-bbc4-bf824c7a865a\") " pod="watcher-kuttl-default/watcher32f6-account-delete-2bphl" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.242455 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-swdf8"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.269032 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.269346 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="e41242f3-47ba-4f24-8287-756b9a7743be" containerName="watcher-kuttl-api-log" containerID="cri-o://d9131e283bbc924a306a6a7e2ec75f01f9e5213922f31101da67362ef4e9fc14" gracePeriod=30 Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.269843 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="e41242f3-47ba-4f24-8287-756b9a7743be" containerName="watcher-api" containerID="cri-o://21ba0c2fb05ca7740fcfdaa71de126d194d83881c0201333dae0d087e258d33e" gracePeriod=30 Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.306363 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-swdf8"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.318934 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-32f6-account-create-mtr5l"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.324466 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-32f6-account-create-mtr5l"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.330737 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher32f6-account-delete-2bphl"] Oct 03 09:09:07 crc kubenswrapper[4765]: E1003 09:09:07.331356 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-zz94x], unattached volumes=[], failed to process volumes=[]: context canceled" pod="watcher-kuttl-default/watcher32f6-account-delete-2bphl" podUID="0174d3df-5a54-49bd-bbc4-bf824c7a865a" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.354334 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zz94x\" (UniqueName: \"kubernetes.io/projected/0174d3df-5a54-49bd-bbc4-bf824c7a865a-kube-api-access-zz94x\") pod \"watcher32f6-account-delete-2bphl\" (UID: \"0174d3df-5a54-49bd-bbc4-bf824c7a865a\") " pod="watcher-kuttl-default/watcher32f6-account-delete-2bphl" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.383591 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz94x\" (UniqueName: \"kubernetes.io/projected/0174d3df-5a54-49bd-bbc4-bf824c7a865a-kube-api-access-zz94x\") pod \"watcher32f6-account-delete-2bphl\" (UID: \"0174d3df-5a54-49bd-bbc4-bf824c7a865a\") " pod="watcher-kuttl-default/watcher32f6-account-delete-2bphl" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.466686 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-db-create-9qsk6"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.467994 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-9qsk6" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.482801 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-9qsk6"] Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.564136 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdvzc\" (UniqueName: \"kubernetes.io/projected/4b719514-ecf2-47e6-a002-fb51b387e66f-kube-api-access-bdvzc\") pod \"watcher-db-create-9qsk6\" (UID: \"4b719514-ecf2-47e6-a002-fb51b387e66f\") " pod="watcher-kuttl-default/watcher-db-create-9qsk6" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.666043 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdvzc\" (UniqueName: \"kubernetes.io/projected/4b719514-ecf2-47e6-a002-fb51b387e66f-kube-api-access-bdvzc\") pod \"watcher-db-create-9qsk6\" (UID: \"4b719514-ecf2-47e6-a002-fb51b387e66f\") " pod="watcher-kuttl-default/watcher-db-create-9qsk6" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.698249 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdvzc\" (UniqueName: \"kubernetes.io/projected/4b719514-ecf2-47e6-a002-fb51b387e66f-kube-api-access-bdvzc\") pod \"watcher-db-create-9qsk6\" (UID: \"4b719514-ecf2-47e6-a002-fb51b387e66f\") " pod="watcher-kuttl-default/watcher-db-create-9qsk6" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.831687 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a42feb3-6ec9-44be-b975-483de1699d32","Type":"ContainerStarted","Data":"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0"} Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.832129 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-9qsk6" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.834751 4765 generic.go:334] "Generic (PLEG): container finished" podID="e41242f3-47ba-4f24-8287-756b9a7743be" containerID="d9131e283bbc924a306a6a7e2ec75f01f9e5213922f31101da67362ef4e9fc14" exitCode=143 Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.834781 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e41242f3-47ba-4f24-8287-756b9a7743be","Type":"ContainerDied","Data":"d9131e283bbc924a306a6a7e2ec75f01f9e5213922f31101da67362ef4e9fc14"} Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.834821 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher32f6-account-delete-2bphl" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.847675 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher32f6-account-delete-2bphl" Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.971056 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz94x\" (UniqueName: \"kubernetes.io/projected/0174d3df-5a54-49bd-bbc4-bf824c7a865a-kube-api-access-zz94x\") pod \"0174d3df-5a54-49bd-bbc4-bf824c7a865a\" (UID: \"0174d3df-5a54-49bd-bbc4-bf824c7a865a\") " Oct 03 09:09:07 crc kubenswrapper[4765]: I1003 09:09:07.975435 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0174d3df-5a54-49bd-bbc4-bf824c7a865a-kube-api-access-zz94x" (OuterVolumeSpecName: "kube-api-access-zz94x") pod "0174d3df-5a54-49bd-bbc4-bf824c7a865a" (UID: "0174d3df-5a54-49bd-bbc4-bf824c7a865a"). InnerVolumeSpecName "kube-api-access-zz94x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.072728 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz94x\" (UniqueName: \"kubernetes.io/projected/0174d3df-5a54-49bd-bbc4-bf824c7a865a-kube-api-access-zz94x\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.342604 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15257523-ed0d-43fe-aa54-96b150144372" path="/var/lib/kubelet/pods/15257523-ed0d-43fe-aa54-96b150144372/volumes" Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.344119 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d8df962-8e48-477a-a6f4-0b2067c1ccdd" path="/var/lib/kubelet/pods/9d8df962-8e48-477a-a6f4-0b2067c1ccdd/volumes" Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.344571 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d93bea-73df-46c9-8ac7-c68d9c6ab117" path="/var/lib/kubelet/pods/e6d93bea-73df-46c9-8ac7-c68d9c6ab117/volumes" Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.423579 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-db-create-9qsk6"] Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.859085 4765 generic.go:334] "Generic (PLEG): container finished" podID="e41242f3-47ba-4f24-8287-756b9a7743be" containerID="21ba0c2fb05ca7740fcfdaa71de126d194d83881c0201333dae0d087e258d33e" exitCode=0 Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.859304 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e41242f3-47ba-4f24-8287-756b9a7743be","Type":"ContainerDied","Data":"21ba0c2fb05ca7740fcfdaa71de126d194d83881c0201333dae0d087e258d33e"} Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.865257 4765 generic.go:334] "Generic (PLEG): container finished" podID="4b719514-ecf2-47e6-a002-fb51b387e66f" containerID="a91f44cc0c667004eee6ca4eea74e36064adc2cbd51680a6064eaf99964c8d76" exitCode=0 Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.865360 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-9qsk6" event={"ID":"4b719514-ecf2-47e6-a002-fb51b387e66f","Type":"ContainerDied","Data":"a91f44cc0c667004eee6ca4eea74e36064adc2cbd51680a6064eaf99964c8d76"} Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.865390 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-9qsk6" event={"ID":"4b719514-ecf2-47e6-a002-fb51b387e66f","Type":"ContainerStarted","Data":"8321c523dfca7fbbde9f84675d5f72e47888c826ca2b7db9a1761ca86bc70f60"} Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.871238 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher32f6-account-delete-2bphl" Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.871274 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a42feb3-6ec9-44be-b975-483de1699d32","Type":"ContainerStarted","Data":"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9"} Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.925689 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher32f6-account-delete-2bphl"] Oct 03 09:09:08 crc kubenswrapper[4765]: I1003 09:09:08.936629 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher32f6-account-delete-2bphl"] Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.177572 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.297175 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-combined-ca-bundle\") pod \"e41242f3-47ba-4f24-8287-756b9a7743be\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.297240 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnhmj\" (UniqueName: \"kubernetes.io/projected/e41242f3-47ba-4f24-8287-756b9a7743be-kube-api-access-hnhmj\") pod \"e41242f3-47ba-4f24-8287-756b9a7743be\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.297277 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-cert-memcached-mtls\") pod \"e41242f3-47ba-4f24-8287-756b9a7743be\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.297332 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41242f3-47ba-4f24-8287-756b9a7743be-logs\") pod \"e41242f3-47ba-4f24-8287-756b9a7743be\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.297402 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-custom-prometheus-ca\") pod \"e41242f3-47ba-4f24-8287-756b9a7743be\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.297464 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-config-data\") pod \"e41242f3-47ba-4f24-8287-756b9a7743be\" (UID: \"e41242f3-47ba-4f24-8287-756b9a7743be\") " Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.301369 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e41242f3-47ba-4f24-8287-756b9a7743be-logs" (OuterVolumeSpecName: "logs") pod "e41242f3-47ba-4f24-8287-756b9a7743be" (UID: "e41242f3-47ba-4f24-8287-756b9a7743be"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.316780 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e41242f3-47ba-4f24-8287-756b9a7743be-kube-api-access-hnhmj" (OuterVolumeSpecName: "kube-api-access-hnhmj") pod "e41242f3-47ba-4f24-8287-756b9a7743be" (UID: "e41242f3-47ba-4f24-8287-756b9a7743be"). InnerVolumeSpecName "kube-api-access-hnhmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.335269 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e41242f3-47ba-4f24-8287-756b9a7743be" (UID: "e41242f3-47ba-4f24-8287-756b9a7743be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.378353 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "e41242f3-47ba-4f24-8287-756b9a7743be" (UID: "e41242f3-47ba-4f24-8287-756b9a7743be"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.378425 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-config-data" (OuterVolumeSpecName: "config-data") pod "e41242f3-47ba-4f24-8287-756b9a7743be" (UID: "e41242f3-47ba-4f24-8287-756b9a7743be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.401185 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.401224 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnhmj\" (UniqueName: \"kubernetes.io/projected/e41242f3-47ba-4f24-8287-756b9a7743be-kube-api-access-hnhmj\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.401239 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e41242f3-47ba-4f24-8287-756b9a7743be-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.401250 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.401259 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.403816 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "e41242f3-47ba-4f24-8287-756b9a7743be" (UID: "e41242f3-47ba-4f24-8287-756b9a7743be"). 
InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.503583 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/e41242f3-47ba-4f24-8287-756b9a7743be-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.835374 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.899743 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"e41242f3-47ba-4f24-8287-756b9a7743be","Type":"ContainerDied","Data":"7b89fa6adccbfdca34f165870625d53aad58669f7962998bbfe754fb7bc9b141"} Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.900722 4765 scope.go:117] "RemoveContainer" containerID="21ba0c2fb05ca7740fcfdaa71de126d194d83881c0201333dae0d087e258d33e" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.900941 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.913703 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-cert-memcached-mtls\") pod \"6a075387-26c0-423a-b081-403cf3aec3b8\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.913775 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-combined-ca-bundle\") pod \"6a075387-26c0-423a-b081-403cf3aec3b8\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.913889 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqf52\" (UniqueName: \"kubernetes.io/projected/6a075387-26c0-423a-b081-403cf3aec3b8-kube-api-access-pqf52\") pod \"6a075387-26c0-423a-b081-403cf3aec3b8\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.913959 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-config-data\") pod \"6a075387-26c0-423a-b081-403cf3aec3b8\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.914017 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a075387-26c0-423a-b081-403cf3aec3b8-logs\") pod \"6a075387-26c0-423a-b081-403cf3aec3b8\" (UID: \"6a075387-26c0-423a-b081-403cf3aec3b8\") " Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.914589 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a075387-26c0-423a-b081-403cf3aec3b8-logs" (OuterVolumeSpecName: "logs") pod "6a075387-26c0-423a-b081-403cf3aec3b8" (UID: "6a075387-26c0-423a-b081-403cf3aec3b8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.915480 4765 generic.go:334] "Generic (PLEG): container finished" podID="6a075387-26c0-423a-b081-403cf3aec3b8" containerID="1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f" exitCode=0 Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.915571 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"6a075387-26c0-423a-b081-403cf3aec3b8","Type":"ContainerDied","Data":"1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f"} Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.915598 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"6a075387-26c0-423a-b081-403cf3aec3b8","Type":"ContainerDied","Data":"b34b1cd005c60b7c3cd429297e2f55cb53ce0cf74b04166d19203351ff3e3162"} Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.915672 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.919227 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a075387-26c0-423a-b081-403cf3aec3b8-kube-api-access-pqf52" (OuterVolumeSpecName: "kube-api-access-pqf52") pod "6a075387-26c0-423a-b081-403cf3aec3b8" (UID: "6a075387-26c0-423a-b081-403cf3aec3b8"). InnerVolumeSpecName "kube-api-access-pqf52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.922335 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a42feb3-6ec9-44be-b975-483de1699d32","Type":"ContainerStarted","Data":"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9"} Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.922705 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.946615 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.122863482 podStartE2EDuration="5.946586127s" podCreationTimestamp="2025-10-03 09:09:04 +0000 UTC" firstStartedPulling="2025-10-03 09:09:05.671303804 +0000 UTC m=+1789.972798134" lastFinishedPulling="2025-10-03 09:09:09.495026449 +0000 UTC m=+1793.796520779" observedRunningTime="2025-10-03 09:09:09.94287755 +0000 UTC m=+1794.244371880" watchObservedRunningTime="2025-10-03 09:09:09.946586127 +0000 UTC m=+1794.248080457" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.959464 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a075387-26c0-423a-b081-403cf3aec3b8" (UID: "6a075387-26c0-423a-b081-403cf3aec3b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:09 crc kubenswrapper[4765]: I1003 09:09:09.965409 4765 scope.go:117] "RemoveContainer" containerID="d9131e283bbc924a306a6a7e2ec75f01f9e5213922f31101da67362ef4e9fc14" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.004495 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.008681 4765 scope.go:117] "RemoveContainer" containerID="1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.012637 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.015778 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "6a075387-26c0-423a-b081-403cf3aec3b8" (UID: "6a075387-26c0-423a-b081-403cf3aec3b8"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.016970 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a075387-26c0-423a-b081-403cf3aec3b8-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.017088 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.017173 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.017246 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqf52\" (UniqueName: \"kubernetes.io/projected/6a075387-26c0-423a-b081-403cf3aec3b8-kube-api-access-pqf52\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.051946 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-config-data" (OuterVolumeSpecName: "config-data") pod "6a075387-26c0-423a-b081-403cf3aec3b8" (UID: "6a075387-26c0-423a-b081-403cf3aec3b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.052443 4765 scope.go:117] "RemoveContainer" containerID="1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f" Oct 03 09:09:10 crc kubenswrapper[4765]: E1003 09:09:10.053220 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f\": container with ID starting with 1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f not found: ID does not exist" containerID="1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.053282 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f"} err="failed to get container status \"1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f\": rpc error: code = NotFound desc = could not find container \"1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f\": container with ID starting with 1672ce3a83397ed7f6e940bf72b09ad0208092104d26081c2f9aefca308f793f not found: ID does not exist" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.119373 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a075387-26c0-423a-b081-403cf3aec3b8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.267176 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.284674 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.310287 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-9qsk6" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.321599 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0174d3df-5a54-49bd-bbc4-bf824c7a865a" path="/var/lib/kubelet/pods/0174d3df-5a54-49bd-bbc4-bf824c7a865a/volumes" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.322182 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a075387-26c0-423a-b081-403cf3aec3b8" path="/var/lib/kubelet/pods/6a075387-26c0-423a-b081-403cf3aec3b8/volumes" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.322772 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e41242f3-47ba-4f24-8287-756b9a7743be" path="/var/lib/kubelet/pods/e41242f3-47ba-4f24-8287-756b9a7743be/volumes" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.428226 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdvzc\" (UniqueName: \"kubernetes.io/projected/4b719514-ecf2-47e6-a002-fb51b387e66f-kube-api-access-bdvzc\") pod \"4b719514-ecf2-47e6-a002-fb51b387e66f\" (UID: \"4b719514-ecf2-47e6-a002-fb51b387e66f\") " Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.434063 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b719514-ecf2-47e6-a002-fb51b387e66f-kube-api-access-bdvzc" (OuterVolumeSpecName: "kube-api-access-bdvzc") pod "4b719514-ecf2-47e6-a002-fb51b387e66f" (UID: "4b719514-ecf2-47e6-a002-fb51b387e66f"). InnerVolumeSpecName "kube-api-access-bdvzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.524832 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.531966 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdvzc\" (UniqueName: \"kubernetes.io/projected/4b719514-ecf2-47e6-a002-fb51b387e66f-kube-api-access-bdvzc\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.632935 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d4477a-79e0-4441-b6d1-789055a87840-logs\") pod \"88d4477a-79e0-4441-b6d1-789055a87840\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.633001 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz9c7\" (UniqueName: \"kubernetes.io/projected/88d4477a-79e0-4441-b6d1-789055a87840-kube-api-access-vz9c7\") pod \"88d4477a-79e0-4441-b6d1-789055a87840\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.633074 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-combined-ca-bundle\") pod \"88d4477a-79e0-4441-b6d1-789055a87840\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.633092 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-custom-prometheus-ca\") pod \"88d4477a-79e0-4441-b6d1-789055a87840\" (UID: 
\"88d4477a-79e0-4441-b6d1-789055a87840\") " Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.633143 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-cert-memcached-mtls\") pod \"88d4477a-79e0-4441-b6d1-789055a87840\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.633183 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-config-data\") pod \"88d4477a-79e0-4441-b6d1-789055a87840\" (UID: \"88d4477a-79e0-4441-b6d1-789055a87840\") " Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.634378 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d4477a-79e0-4441-b6d1-789055a87840-logs" (OuterVolumeSpecName: "logs") pod "88d4477a-79e0-4441-b6d1-789055a87840" (UID: "88d4477a-79e0-4441-b6d1-789055a87840"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.644740 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88d4477a-79e0-4441-b6d1-789055a87840-kube-api-access-vz9c7" (OuterVolumeSpecName: "kube-api-access-vz9c7") pod "88d4477a-79e0-4441-b6d1-789055a87840" (UID: "88d4477a-79e0-4441-b6d1-789055a87840"). InnerVolumeSpecName "kube-api-access-vz9c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.662383 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "88d4477a-79e0-4441-b6d1-789055a87840" (UID: "88d4477a-79e0-4441-b6d1-789055a87840"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.664711 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88d4477a-79e0-4441-b6d1-789055a87840" (UID: "88d4477a-79e0-4441-b6d1-789055a87840"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.716582 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-config-data" (OuterVolumeSpecName: "config-data") pod "88d4477a-79e0-4441-b6d1-789055a87840" (UID: "88d4477a-79e0-4441-b6d1-789055a87840"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.734475 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.734509 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88d4477a-79e0-4441-b6d1-789055a87840-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.734518 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz9c7\" (UniqueName: \"kubernetes.io/projected/88d4477a-79e0-4441-b6d1-789055a87840-kube-api-access-vz9c7\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.734527 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.734536 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.737601 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "88d4477a-79e0-4441-b6d1-789055a87840" (UID: "88d4477a-79e0-4441-b6d1-789055a87840"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.835938 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/88d4477a-79e0-4441-b6d1-789055a87840-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.931132 4765 generic.go:334] "Generic (PLEG): container finished" podID="88d4477a-79e0-4441-b6d1-789055a87840" containerID="94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a" exitCode=0 Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.931176 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"88d4477a-79e0-4441-b6d1-789055a87840","Type":"ContainerDied","Data":"94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a"} Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.931220 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"88d4477a-79e0-4441-b6d1-789055a87840","Type":"ContainerDied","Data":"e4c5e7cef3b36b71391f3e13f32c93700edab7914120a74ef62c2f8b12c94d98"} Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.931239 4765 scope.go:117] "RemoveContainer" containerID="94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.932273 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.932957 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-db-create-9qsk6" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.932952 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-db-create-9qsk6" event={"ID":"4b719514-ecf2-47e6-a002-fb51b387e66f","Type":"ContainerDied","Data":"8321c523dfca7fbbde9f84675d5f72e47888c826ca2b7db9a1761ca86bc70f60"} Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.933072 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8321c523dfca7fbbde9f84675d5f72e47888c826ca2b7db9a1761ca86bc70f60" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.958484 4765 scope.go:117] "RemoveContainer" containerID="94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a" Oct 03 09:09:10 crc kubenswrapper[4765]: E1003 09:09:10.959056 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a\": container with ID starting with 94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a not found: ID does not exist" containerID="94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.959177 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a"} err="failed to get container status \"94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a\": rpc error: code = NotFound desc = could not find container \"94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a\": container with ID starting with 94718d2e8bbef895d589842ad2becfcdf4813f57392af0b9e7b4f7132b9b556a not found: ID does not exist" Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.969807 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:09:10 crc kubenswrapper[4765]: I1003 09:09:10.981310 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:09:11 crc kubenswrapper[4765]: I1003 09:09:11.307050 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:09:11 crc kubenswrapper[4765]: E1003 09:09:11.307330 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:09:11 crc kubenswrapper[4765]: I1003 09:09:11.642023 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:12 crc kubenswrapper[4765]: I1003 09:09:12.316455 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88d4477a-79e0-4441-b6d1-789055a87840" path="/var/lib/kubelet/pods/88d4477a-79e0-4441-b6d1-789055a87840/volumes" Oct 03 09:09:12 crc kubenswrapper[4765]: I1003 09:09:12.958467 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" 
containerName="ceilometer-central-agent" containerID="cri-o://798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76" gracePeriod=30 Oct 03 09:09:12 crc kubenswrapper[4765]: I1003 09:09:12.958572 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="proxy-httpd" containerID="cri-o://385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9" gracePeriod=30 Oct 03 09:09:12 crc kubenswrapper[4765]: I1003 09:09:12.958607 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="sg-core" containerID="cri-o://3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9" gracePeriod=30 Oct 03 09:09:12 crc kubenswrapper[4765]: I1003 09:09:12.958638 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="ceilometer-notification-agent" containerID="cri-o://b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0" gracePeriod=30 Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.764692 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.879910 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-log-httpd\") pod \"8a42feb3-6ec9-44be-b975-483de1699d32\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.879991 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-scripts\") pod \"8a42feb3-6ec9-44be-b975-483de1699d32\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.880087 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-ceilometer-tls-certs\") pod \"8a42feb3-6ec9-44be-b975-483de1699d32\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.880129 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-run-httpd\") pod \"8a42feb3-6ec9-44be-b975-483de1699d32\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.880191 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt4dt\" (UniqueName: \"kubernetes.io/projected/8a42feb3-6ec9-44be-b975-483de1699d32-kube-api-access-dt4dt\") pod \"8a42feb3-6ec9-44be-b975-483de1699d32\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.880213 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-sg-core-conf-yaml\") pod \"8a42feb3-6ec9-44be-b975-483de1699d32\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " Oct 03 09:09:13 crc 
kubenswrapper[4765]: I1003 09:09:13.880274 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-combined-ca-bundle\") pod \"8a42feb3-6ec9-44be-b975-483de1699d32\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.880298 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-config-data\") pod \"8a42feb3-6ec9-44be-b975-483de1699d32\" (UID: \"8a42feb3-6ec9-44be-b975-483de1699d32\") " Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.880427 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a42feb3-6ec9-44be-b975-483de1699d32" (UID: "8a42feb3-6ec9-44be-b975-483de1699d32"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.880702 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.880708 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a42feb3-6ec9-44be-b975-483de1699d32" (UID: "8a42feb3-6ec9-44be-b975-483de1699d32"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.886332 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a42feb3-6ec9-44be-b975-483de1699d32-kube-api-access-dt4dt" (OuterVolumeSpecName: "kube-api-access-dt4dt") pod "8a42feb3-6ec9-44be-b975-483de1699d32" (UID: "8a42feb3-6ec9-44be-b975-483de1699d32"). InnerVolumeSpecName "kube-api-access-dt4dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.886532 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-scripts" (OuterVolumeSpecName: "scripts") pod "8a42feb3-6ec9-44be-b975-483de1699d32" (UID: "8a42feb3-6ec9-44be-b975-483de1699d32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.904465 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a42feb3-6ec9-44be-b975-483de1699d32" (UID: "8a42feb3-6ec9-44be-b975-483de1699d32"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.923629 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8a42feb3-6ec9-44be-b975-483de1699d32" (UID: "8a42feb3-6ec9-44be-b975-483de1699d32"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.943092 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a42feb3-6ec9-44be-b975-483de1699d32" (UID: "8a42feb3-6ec9-44be-b975-483de1699d32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.967814 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-config-data" (OuterVolumeSpecName: "config-data") pod "8a42feb3-6ec9-44be-b975-483de1699d32" (UID: "8a42feb3-6ec9-44be-b975-483de1699d32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.971351 4765 generic.go:334] "Generic (PLEG): container finished" podID="8a42feb3-6ec9-44be-b975-483de1699d32" containerID="385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9" exitCode=0 Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.971497 4765 generic.go:334] "Generic (PLEG): container finished" podID="8a42feb3-6ec9-44be-b975-483de1699d32" containerID="3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9" exitCode=2 Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.971580 4765 generic.go:334] "Generic (PLEG): container finished" podID="8a42feb3-6ec9-44be-b975-483de1699d32" containerID="b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0" exitCode=0 Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.971431 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.971400 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a42feb3-6ec9-44be-b975-483de1699d32","Type":"ContainerDied","Data":"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9"} Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.971774 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a42feb3-6ec9-44be-b975-483de1699d32","Type":"ContainerDied","Data":"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9"} Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.971793 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a42feb3-6ec9-44be-b975-483de1699d32","Type":"ContainerDied","Data":"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0"} Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.971804 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a42feb3-6ec9-44be-b975-483de1699d32","Type":"ContainerDied","Data":"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76"} Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.971834 4765 scope.go:117] "RemoveContainer" containerID="385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.971666 4765 generic.go:334] "Generic (PLEG): container finished" podID="8a42feb3-6ec9-44be-b975-483de1699d32" containerID="798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76" exitCode=0 Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.972025 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"8a42feb3-6ec9-44be-b975-483de1699d32","Type":"ContainerDied","Data":"004e84bec52ff853beae9062128392d9322c516ad141f706f783e01131d9397a"} Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.982661 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.982694 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.982706 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.982716 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.982725 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a42feb3-6ec9-44be-b975-483de1699d32-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.982735 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt4dt\" (UniqueName: 
\"kubernetes.io/projected/8a42feb3-6ec9-44be-b975-483de1699d32-kube-api-access-dt4dt\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:13 crc kubenswrapper[4765]: I1003 09:09:13.982744 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a42feb3-6ec9-44be-b975-483de1699d32-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.000170 4765 scope.go:117] "RemoveContainer" containerID="3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.004223 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.011776 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.023278 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.023630 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="sg-core" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.023766 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="sg-core" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.023789 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b719514-ecf2-47e6-a002-fb51b387e66f" containerName="mariadb-database-create" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.023797 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b719514-ecf2-47e6-a002-fb51b387e66f" containerName="mariadb-database-create" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.023811 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="ceilometer-central-agent" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.023819 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="ceilometer-central-agent" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.023830 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41242f3-47ba-4f24-8287-756b9a7743be" containerName="watcher-kuttl-api-log" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.023836 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41242f3-47ba-4f24-8287-756b9a7743be" containerName="watcher-kuttl-api-log" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.023844 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="ceilometer-notification-agent" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.023851 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="ceilometer-notification-agent" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.023859 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="proxy-httpd" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.023864 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="proxy-httpd" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.023875 4765 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e41242f3-47ba-4f24-8287-756b9a7743be" containerName="watcher-api" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.023881 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e41242f3-47ba-4f24-8287-756b9a7743be" containerName="watcher-api" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.023900 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88d4477a-79e0-4441-b6d1-789055a87840" containerName="watcher-decision-engine" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.023907 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d4477a-79e0-4441-b6d1-789055a87840" containerName="watcher-decision-engine" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.023917 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a075387-26c0-423a-b081-403cf3aec3b8" containerName="watcher-applier" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.023924 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a075387-26c0-423a-b081-403cf3aec3b8" containerName="watcher-applier" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.024072 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b719514-ecf2-47e6-a002-fb51b387e66f" containerName="mariadb-database-create" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.024084 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="ceilometer-notification-agent" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.024092 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="ceilometer-central-agent" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.024104 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a075387-26c0-423a-b081-403cf3aec3b8" containerName="watcher-applier" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.024112 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41242f3-47ba-4f24-8287-756b9a7743be" containerName="watcher-api" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.024121 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="sg-core" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.024132 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e41242f3-47ba-4f24-8287-756b9a7743be" containerName="watcher-kuttl-api-log" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.024144 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="88d4477a-79e0-4441-b6d1-789055a87840" containerName="watcher-decision-engine" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.024156 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" containerName="proxy-httpd" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.032967 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.036878 4765 scope.go:117] "RemoveContainer" containerID="b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.037043 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.037132 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.037262 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.046041 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.076828 4765 scope.go:117] "RemoveContainer" containerID="798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.137015 4765 scope.go:117] "RemoveContainer" containerID="385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.138145 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9\": container with ID starting with 385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9 not found: ID does not exist" containerID="385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.138184 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9"} err="failed to get container status \"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9\": rpc error: code = NotFound desc = could not find container \"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9\": container with ID starting with 385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.138204 4765 scope.go:117] "RemoveContainer" containerID="3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.138521 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9\": container with ID starting with 3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9 not found: ID does not exist" containerID="3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.138544 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9"} err="failed to get container status \"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9\": rpc error: code = NotFound desc = could not find container \"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9\": container with ID starting with 
3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.138558 4765 scope.go:117] "RemoveContainer" containerID="b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.138948 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0\": container with ID starting with b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0 not found: ID does not exist" containerID="b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.138968 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0"} err="failed to get container status \"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0\": rpc error: code = NotFound desc = could not find container \"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0\": container with ID starting with b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.138979 4765 scope.go:117] "RemoveContainer" containerID="798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76" Oct 03 09:09:14 crc kubenswrapper[4765]: E1003 09:09:14.139153 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76\": container with ID starting with 798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76 not found: ID does not exist" containerID="798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.139172 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76"} err="failed to get container status \"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76\": rpc error: code = NotFound desc = could not find container \"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76\": container with ID starting with 798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.139183 4765 scope.go:117] "RemoveContainer" containerID="385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.139352 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9"} err="failed to get container status \"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9\": rpc error: code = NotFound desc = could not find container \"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9\": container with ID starting with 385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.139368 4765 scope.go:117] "RemoveContainer" containerID="3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9" Oct 03 09:09:14 crc 
kubenswrapper[4765]: I1003 09:09:14.139540 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9"} err="failed to get container status \"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9\": rpc error: code = NotFound desc = could not find container \"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9\": container with ID starting with 3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.139558 4765 scope.go:117] "RemoveContainer" containerID="b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.139782 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0"} err="failed to get container status \"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0\": rpc error: code = NotFound desc = could not find container \"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0\": container with ID starting with b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.139798 4765 scope.go:117] "RemoveContainer" containerID="798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.139974 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76"} err="failed to get container status \"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76\": rpc error: code = NotFound desc = could not find container \"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76\": container with ID starting with 798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.139993 4765 scope.go:117] "RemoveContainer" containerID="385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.140147 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9"} err="failed to get container status \"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9\": rpc error: code = NotFound desc = could not find container \"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9\": container with ID starting with 385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.140164 4765 scope.go:117] "RemoveContainer" containerID="3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.140305 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9"} err="failed to get container status \"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9\": rpc error: code = NotFound desc = could not find container \"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9\": container with ID 
starting with 3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.140320 4765 scope.go:117] "RemoveContainer" containerID="b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.140467 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0"} err="failed to get container status \"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0\": rpc error: code = NotFound desc = could not find container \"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0\": container with ID starting with b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.140480 4765 scope.go:117] "RemoveContainer" containerID="798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.140620 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76"} err="failed to get container status \"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76\": rpc error: code = NotFound desc = could not find container \"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76\": container with ID starting with 798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.140635 4765 scope.go:117] "RemoveContainer" containerID="385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.140826 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9"} err="failed to get container status \"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9\": rpc error: code = NotFound desc = could not find container \"385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9\": container with ID starting with 385c741f51973737dbcc4bdf3c5f9a242319cfeebac8cc4f97105aa61fec12b9 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.140846 4765 scope.go:117] "RemoveContainer" containerID="3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.141038 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9"} err="failed to get container status \"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9\": rpc error: code = NotFound desc = could not find container \"3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9\": container with ID starting with 3ba420715ce6e255eb4f5665a813c79671375a147299526e605b2f4d2429a0c9 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.141055 4765 scope.go:117] "RemoveContainer" containerID="b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.141212 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0"} err="failed to get container status \"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0\": rpc error: code = NotFound desc = could not find container \"b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0\": container with ID starting with b72f6e8147b39ef2181f04c69b2e91ea6bc922de7763058a87a64082af3510c0 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.141229 4765 scope.go:117] "RemoveContainer" containerID="798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.141415 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76"} err="failed to get container status \"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76\": rpc error: code = NotFound desc = could not find container \"798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76\": container with ID starting with 798a5f5518e5dab2a45e0776ea7b4ddd388efb363d594c7b1ce00b3987f31c76 not found: ID does not exist" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.186186 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-scripts\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.186268 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-run-httpd\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.186292 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-config-data\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.186323 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-log-httpd\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.186341 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97wn9\" (UniqueName: \"kubernetes.io/projected/27cb84f7-b506-44f1-8200-799520d6baa9-kube-api-access-97wn9\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.186388 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc 
kubenswrapper[4765]: I1003 09:09:14.186404 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.186430 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.287513 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.287925 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-scripts\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.287990 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-run-httpd\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.288021 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-config-data\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.288059 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-log-httpd\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.288081 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97wn9\" (UniqueName: \"kubernetes.io/projected/27cb84f7-b506-44f1-8200-799520d6baa9-kube-api-access-97wn9\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.288130 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.288147 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.289744 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-log-httpd\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.290248 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-run-httpd\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.292370 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-config-data\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.293014 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.293058 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.293387 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-scripts\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.299101 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.307235 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97wn9\" (UniqueName: \"kubernetes.io/projected/27cb84f7-b506-44f1-8200-799520d6baa9-kube-api-access-97wn9\") pod \"ceilometer-0\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.323487 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a42feb3-6ec9-44be-b975-483de1699d32" path="/var/lib/kubelet/pods/8a42feb3-6ec9-44be-b975-483de1699d32/volumes" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.354564 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.774509 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:14 crc kubenswrapper[4765]: I1003 09:09:14.982286 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"27cb84f7-b506-44f1-8200-799520d6baa9","Type":"ContainerStarted","Data":"3c48ed6ddf8a4f06e9860ff39759ce2f9767fcdebd0c1a4bb09d7728241b2f51"} Oct 03 09:09:15 crc kubenswrapper[4765]: I1003 09:09:15.992174 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"27cb84f7-b506-44f1-8200-799520d6baa9","Type":"ContainerStarted","Data":"d0e8c32c1f9dd668ba1656d79e066d004298fc25e969e6502733139680d7ca22"} Oct 03 09:09:17 crc kubenswrapper[4765]: I1003 09:09:17.004323 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"27cb84f7-b506-44f1-8200-799520d6baa9","Type":"ContainerStarted","Data":"ade6a3f26a6c6c39349acb24c2ed7e1ff8f7b63b70b4dee90e3a2edae7870479"} Oct 03 09:09:17 crc kubenswrapper[4765]: I1003 09:09:17.004953 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"27cb84f7-b506-44f1-8200-799520d6baa9","Type":"ContainerStarted","Data":"c96dddc539346e09fe9cb6f685f8a10316bf692897986860f69b2f04d321afab"} Oct 03 09:09:17 crc kubenswrapper[4765]: I1003 09:09:17.516182 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-pfw25"] Oct 03 09:09:17 crc kubenswrapper[4765]: I1003 09:09:17.517462 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-pfw25" Oct 03 09:09:17 crc kubenswrapper[4765]: I1003 09:09:17.519460 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-db-secret" Oct 03 09:09:17 crc kubenswrapper[4765]: I1003 09:09:17.527496 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-pfw25"] Oct 03 09:09:17 crc kubenswrapper[4765]: I1003 09:09:17.583657 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb66f\" (UniqueName: \"kubernetes.io/projected/87745db5-ec67-4cc3-9f16-fdde96423caa-kube-api-access-mb66f\") pod \"watcher-test-account-create-pfw25\" (UID: \"87745db5-ec67-4cc3-9f16-fdde96423caa\") " pod="watcher-kuttl-default/watcher-test-account-create-pfw25" Oct 03 09:09:17 crc kubenswrapper[4765]: I1003 09:09:17.684685 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb66f\" (UniqueName: \"kubernetes.io/projected/87745db5-ec67-4cc3-9f16-fdde96423caa-kube-api-access-mb66f\") pod \"watcher-test-account-create-pfw25\" (UID: \"87745db5-ec67-4cc3-9f16-fdde96423caa\") " pod="watcher-kuttl-default/watcher-test-account-create-pfw25" Oct 03 09:09:17 crc kubenswrapper[4765]: I1003 09:09:17.711358 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb66f\" (UniqueName: \"kubernetes.io/projected/87745db5-ec67-4cc3-9f16-fdde96423caa-kube-api-access-mb66f\") pod \"watcher-test-account-create-pfw25\" (UID: \"87745db5-ec67-4cc3-9f16-fdde96423caa\") " pod="watcher-kuttl-default/watcher-test-account-create-pfw25" Oct 03 09:09:17 crc kubenswrapper[4765]: I1003 09:09:17.843421 
4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-pfw25" Oct 03 09:09:18 crc kubenswrapper[4765]: I1003 09:09:18.294795 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-pfw25"] Oct 03 09:09:18 crc kubenswrapper[4765]: W1003 09:09:18.298813 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87745db5_ec67_4cc3_9f16_fdde96423caa.slice/crio-642880723f1061d2300162d0d6f45155cf8ff8369f304dd56e84074f6499a908 WatchSource:0}: Error finding container 642880723f1061d2300162d0d6f45155cf8ff8369f304dd56e84074f6499a908: Status 404 returned error can't find the container with id 642880723f1061d2300162d0d6f45155cf8ff8369f304dd56e84074f6499a908 Oct 03 09:09:19 crc kubenswrapper[4765]: I1003 09:09:19.021436 4765 generic.go:334] "Generic (PLEG): container finished" podID="87745db5-ec67-4cc3-9f16-fdde96423caa" containerID="8f7a0a346594db76148950b9853e166a62634e5289700e48c0ba599d70913a66" exitCode=0 Oct 03 09:09:19 crc kubenswrapper[4765]: I1003 09:09:19.021516 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-pfw25" event={"ID":"87745db5-ec67-4cc3-9f16-fdde96423caa","Type":"ContainerDied","Data":"8f7a0a346594db76148950b9853e166a62634e5289700e48c0ba599d70913a66"} Oct 03 09:09:19 crc kubenswrapper[4765]: I1003 09:09:19.021546 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-pfw25" event={"ID":"87745db5-ec67-4cc3-9f16-fdde96423caa","Type":"ContainerStarted","Data":"642880723f1061d2300162d0d6f45155cf8ff8369f304dd56e84074f6499a908"} Oct 03 09:09:19 crc kubenswrapper[4765]: I1003 09:09:19.024456 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"27cb84f7-b506-44f1-8200-799520d6baa9","Type":"ContainerStarted","Data":"2aecf6015ad7cb8364e97e7103329978881d3a4e7f1aae01599c727dbceb7ab9"} Oct 03 09:09:19 crc kubenswrapper[4765]: I1003 09:09:19.025211 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:19 crc kubenswrapper[4765]: I1003 09:09:19.063879 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.743316266 podStartE2EDuration="5.063844728s" podCreationTimestamp="2025-10-03 09:09:14 +0000 UTC" firstStartedPulling="2025-10-03 09:09:14.790773444 +0000 UTC m=+1799.092267774" lastFinishedPulling="2025-10-03 09:09:18.111301906 +0000 UTC m=+1802.412796236" observedRunningTime="2025-10-03 09:09:19.062119613 +0000 UTC m=+1803.363613963" watchObservedRunningTime="2025-10-03 09:09:19.063844728 +0000 UTC m=+1803.365339068" Oct 03 09:09:19 crc kubenswrapper[4765]: I1003 09:09:19.128312 4765 scope.go:117] "RemoveContainer" containerID="7f0779cb52b4215fc79ae8065c56892a6740e507e4a3a5b41f70a32e94b373ec" Oct 03 09:09:19 crc kubenswrapper[4765]: I1003 09:09:19.150045 4765 scope.go:117] "RemoveContainer" containerID="fb430e7d139d8ef6c9d2c00d38b03c6289fd36807707c4b86a3953e4fc3f713d" Oct 03 09:09:19 crc kubenswrapper[4765]: I1003 09:09:19.208131 4765 scope.go:117] "RemoveContainer" containerID="c6deb8c59356abc0c9a2545173d3120e90f8ca2abc81252a82f9ce2813e834a4" Oct 03 09:09:19 crc kubenswrapper[4765]: I1003 09:09:19.252522 4765 scope.go:117] "RemoveContainer" 
containerID="066430c682df73e67f1c634704cdfee0b662d84f04be4d8d8c4ef68a2c49e23f" Oct 03 09:09:19 crc kubenswrapper[4765]: I1003 09:09:19.280876 4765 scope.go:117] "RemoveContainer" containerID="4459abc515635cdd34530daac327cd3e8b9803349e47b91ff79beed3a2f9e681" Oct 03 09:09:20 crc kubenswrapper[4765]: I1003 09:09:20.394893 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-pfw25" Oct 03 09:09:20 crc kubenswrapper[4765]: I1003 09:09:20.535502 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb66f\" (UniqueName: \"kubernetes.io/projected/87745db5-ec67-4cc3-9f16-fdde96423caa-kube-api-access-mb66f\") pod \"87745db5-ec67-4cc3-9f16-fdde96423caa\" (UID: \"87745db5-ec67-4cc3-9f16-fdde96423caa\") " Oct 03 09:09:20 crc kubenswrapper[4765]: I1003 09:09:20.540994 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87745db5-ec67-4cc3-9f16-fdde96423caa-kube-api-access-mb66f" (OuterVolumeSpecName: "kube-api-access-mb66f") pod "87745db5-ec67-4cc3-9f16-fdde96423caa" (UID: "87745db5-ec67-4cc3-9f16-fdde96423caa"). InnerVolumeSpecName "kube-api-access-mb66f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:20 crc kubenswrapper[4765]: I1003 09:09:20.638052 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb66f\" (UniqueName: \"kubernetes.io/projected/87745db5-ec67-4cc3-9f16-fdde96423caa-kube-api-access-mb66f\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:21 crc kubenswrapper[4765]: I1003 09:09:21.041974 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-test-account-create-pfw25" Oct 03 09:09:21 crc kubenswrapper[4765]: I1003 09:09:21.042472 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-test-account-create-pfw25" event={"ID":"87745db5-ec67-4cc3-9f16-fdde96423caa","Type":"ContainerDied","Data":"642880723f1061d2300162d0d6f45155cf8ff8369f304dd56e84074f6499a908"} Oct 03 09:09:21 crc kubenswrapper[4765]: I1003 09:09:21.042524 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="642880723f1061d2300162d0d6f45155cf8ff8369f304dd56e84074f6499a908" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.306518 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:09:22 crc kubenswrapper[4765]: E1003 09:09:22.307977 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.686038 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-94ndg"] Oct 03 09:09:22 crc kubenswrapper[4765]: E1003 09:09:22.686374 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87745db5-ec67-4cc3-9f16-fdde96423caa" containerName="mariadb-account-create" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.686389 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="87745db5-ec67-4cc3-9f16-fdde96423caa" 
containerName="mariadb-account-create" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.686525 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="87745db5-ec67-4cc3-9f16-fdde96423caa" containerName="mariadb-account-create" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.687258 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.690478 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.690819 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-9t6jr" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.700213 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-94ndg"] Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.871197 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-config-data\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.871272 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v84s8\" (UniqueName: \"kubernetes.io/projected/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-kube-api-access-v84s8\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.871370 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.871394 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-db-sync-config-data\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.972600 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-config-data\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.972999 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v84s8\" (UniqueName: \"kubernetes.io/projected/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-kube-api-access-v84s8\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.973140 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.973232 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-db-sync-config-data\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.977374 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-db-sync-config-data\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.977606 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-config-data\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:22 crc kubenswrapper[4765]: I1003 09:09:22.978316 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-combined-ca-bundle\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:23 crc kubenswrapper[4765]: I1003 09:09:23.001434 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v84s8\" (UniqueName: \"kubernetes.io/projected/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-kube-api-access-v84s8\") pod \"watcher-kuttl-db-sync-94ndg\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:23 crc kubenswrapper[4765]: I1003 09:09:23.005515 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:23 crc kubenswrapper[4765]: I1003 09:09:23.458570 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-94ndg"] Oct 03 09:09:24 crc kubenswrapper[4765]: I1003 09:09:24.073371 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" event={"ID":"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab","Type":"ContainerStarted","Data":"a1e1ad9af4dd403de16b7205ea870344fe8ae66f6f46d7c8cd0144ff04b9c5b9"} Oct 03 09:09:24 crc kubenswrapper[4765]: I1003 09:09:24.073754 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" event={"ID":"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab","Type":"ContainerStarted","Data":"750113aa258ece7496be4d96e09ef50c63ccba3a3a24cdb4c259b2e918846c07"} Oct 03 09:09:24 crc kubenswrapper[4765]: I1003 09:09:24.098703 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" podStartSLOduration=2.098681934 podStartE2EDuration="2.098681934s" podCreationTimestamp="2025-10-03 09:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:09:24.090408129 +0000 UTC m=+1808.391902459" watchObservedRunningTime="2025-10-03 09:09:24.098681934 +0000 UTC m=+1808.400176264" Oct 03 09:09:27 crc kubenswrapper[4765]: I1003 09:09:27.099019 4765 generic.go:334] "Generic (PLEG): container finished" podID="7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab" containerID="a1e1ad9af4dd403de16b7205ea870344fe8ae66f6f46d7c8cd0144ff04b9c5b9" exitCode=0 Oct 03 09:09:27 crc kubenswrapper[4765]: I1003 09:09:27.099095 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" event={"ID":"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab","Type":"ContainerDied","Data":"a1e1ad9af4dd403de16b7205ea870344fe8ae66f6f46d7c8cd0144ff04b9c5b9"} Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.485027 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.576009 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-combined-ca-bundle\") pod \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.576064 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-config-data\") pod \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.576103 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-db-sync-config-data\") pod \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.576134 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v84s8\" (UniqueName: \"kubernetes.io/projected/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-kube-api-access-v84s8\") pod \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\" (UID: \"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab\") " Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.581062 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab" (UID: "7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.581513 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-kube-api-access-v84s8" (OuterVolumeSpecName: "kube-api-access-v84s8") pod "7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab" (UID: "7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab"). InnerVolumeSpecName "kube-api-access-v84s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.600879 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab" (UID: "7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.617804 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-config-data" (OuterVolumeSpecName: "config-data") pod "7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab" (UID: "7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.677923 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.677959 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.677968 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:28 crc kubenswrapper[4765]: I1003 09:09:28.677977 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v84s8\" (UniqueName: \"kubernetes.io/projected/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab-kube-api-access-v84s8\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.116308 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" event={"ID":"7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab","Type":"ContainerDied","Data":"750113aa258ece7496be4d96e09ef50c63ccba3a3a24cdb4c259b2e918846c07"} Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.116340 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-sync-94ndg" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.116352 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="750113aa258ece7496be4d96e09ef50c63ccba3a3a24cdb4c259b2e918846c07" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.380271 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:09:29 crc kubenswrapper[4765]: E1003 09:09:29.380636 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab" containerName="watcher-kuttl-db-sync" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.380666 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab" containerName="watcher-kuttl-db-sync" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.380869 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab" containerName="watcher-kuttl-db-sync" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.381901 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.390933 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-watcher-kuttl-dockercfg-9t6jr" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.391144 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-api-config-data" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.398472 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.413924 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.415284 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.440820 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.483726 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.485195 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495453 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-decision-engine-config-data" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495568 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495664 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495696 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495721 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtdgd\" (UniqueName: \"kubernetes.io/projected/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-kube-api-access-vtdgd\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495769 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495789 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495818 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495838 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495860 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495878 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15583e2e-4769-40c5-b09e-4aa70e3dc238-logs\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495913 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495941 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szbpj\" (UniqueName: \"kubernetes.io/projected/f7dfb030-e149-4806-bf36-54b8283a9027-kube-api-access-szbpj\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495965 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7dfb030-e149-4806-bf36-54b8283a9027-logs\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.495986 
4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.496015 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.496053 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.496075 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.496107 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-777js\" (UniqueName: \"kubernetes.io/projected/15583e2e-4769-40c5-b09e-4aa70e3dc238-kube-api-access-777js\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.508232 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.524715 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.526318 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.529465 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.532465 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-applier-config-data" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.598847 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.598885 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smmbc\" (UniqueName: \"kubernetes.io/projected/5717d110-7f82-4448-8891-4a0dc6ae2703-kube-api-access-smmbc\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.598912 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.598928 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.598952 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599111 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15583e2e-4769-40c5-b09e-4aa70e3dc238-logs\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599171 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599228 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-config-data\") pod \"watcher-kuttl-api-0\" (UID: 
\"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599257 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szbpj\" (UniqueName: \"kubernetes.io/projected/f7dfb030-e149-4806-bf36-54b8283a9027-kube-api-access-szbpj\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599293 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7dfb030-e149-4806-bf36-54b8283a9027-logs\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599320 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599354 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599412 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599446 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599472 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599519 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-777js\" (UniqueName: \"kubernetes.io/projected/15583e2e-4769-40c5-b09e-4aa70e3dc238-kube-api-access-777js\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599565 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: 
\"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599686 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599717 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599747 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtdgd\" (UniqueName: \"kubernetes.io/projected/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-kube-api-access-vtdgd\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599772 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5717d110-7f82-4448-8891-4a0dc6ae2703-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599850 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.599886 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.600290 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7dfb030-e149-4806-bf36-54b8283a9027-logs\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.600778 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15583e2e-4769-40c5-b09e-4aa70e3dc238-logs\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.603728 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-config-data\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " 
pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.604603 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-custom-prometheus-ca\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.604937 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-logs\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.606177 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-config-data\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.606765 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-custom-prometheus-ca\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.607224 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-combined-ca-bundle\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.611504 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-config-data\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.611950 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-combined-ca-bundle\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.613579 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-cert-memcached-mtls\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.619230 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-cert-memcached-mtls\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc 
kubenswrapper[4765]: I1003 09:09:29.619935 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-cert-memcached-mtls\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.624331 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-custom-prometheus-ca\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.633267 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtdgd\" (UniqueName: \"kubernetes.io/projected/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-kube-api-access-vtdgd\") pod \"watcher-kuttl-decision-engine-0\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.633411 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-combined-ca-bundle\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.636420 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-777js\" (UniqueName: \"kubernetes.io/projected/15583e2e-4769-40c5-b09e-4aa70e3dc238-kube-api-access-777js\") pod \"watcher-kuttl-api-1\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.641577 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szbpj\" (UniqueName: \"kubernetes.io/projected/f7dfb030-e149-4806-bf36-54b8283a9027-kube-api-access-szbpj\") pod \"watcher-kuttl-api-0\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.698288 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.700811 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.700888 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5717d110-7f82-4448-8891-4a0dc6ae2703-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.700931 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.700947 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smmbc\" (UniqueName: \"kubernetes.io/projected/5717d110-7f82-4448-8891-4a0dc6ae2703-kube-api-access-smmbc\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.700969 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.701539 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5717d110-7f82-4448-8891-4a0dc6ae2703-logs\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.705196 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-cert-memcached-mtls\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.705730 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-combined-ca-bundle\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.706192 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-config-data\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 
09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.720932 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smmbc\" (UniqueName: \"kubernetes.io/projected/5717d110-7f82-4448-8891-4a0dc6ae2703-kube-api-access-smmbc\") pod \"watcher-kuttl-applier-0\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.730184 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.809371 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:29 crc kubenswrapper[4765]: I1003 09:09:29.846128 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:30 crc kubenswrapper[4765]: I1003 09:09:30.200318 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:09:30 crc kubenswrapper[4765]: I1003 09:09:30.284562 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:09:30 crc kubenswrapper[4765]: W1003 09:09:30.294412 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15583e2e_4769_40c5_b09e_4aa70e3dc238.slice/crio-1bc5a52997906850877a33d3b57684d1e34de387e3af7c0038b3232a3a910438 WatchSource:0}: Error finding container 1bc5a52997906850877a33d3b57684d1e34de387e3af7c0038b3232a3a910438: Status 404 returned error can't find the container with id 1bc5a52997906850877a33d3b57684d1e34de387e3af7c0038b3232a3a910438 Oct 03 09:09:30 crc kubenswrapper[4765]: I1003 09:09:30.362225 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:09:30 crc kubenswrapper[4765]: I1003 09:09:30.484695 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.142903 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"5717d110-7f82-4448-8891-4a0dc6ae2703","Type":"ContainerStarted","Data":"6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4"} Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.143746 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"5717d110-7f82-4448-8891-4a0dc6ae2703","Type":"ContainerStarted","Data":"64ec5330ef3fb8efbf1e78af1cd48a08eb5eeb39e6029de50dff0ce87af3ecad"} Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.155107 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"f7dfb030-e149-4806-bf36-54b8283a9027","Type":"ContainerStarted","Data":"b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7"} Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.155174 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"f7dfb030-e149-4806-bf36-54b8283a9027","Type":"ContainerStarted","Data":"7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c"} Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.155192 4765 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.155202 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"f7dfb030-e149-4806-bf36-54b8283a9027","Type":"ContainerStarted","Data":"6fa95d06c4af9c88ede6b79db55d33367db0f16af253befc47f5356328c1aa4f"} Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.157779 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2e44e319-a13c-4e4a-bf9c-775e09f92bc3","Type":"ContainerStarted","Data":"2f7dafdada6021fd8062c4874aa556dcda55353642cac651f706241412a4279f"} Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.157843 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2e44e319-a13c-4e4a-bf9c-775e09f92bc3","Type":"ContainerStarted","Data":"f612bd5c46fc3581179eab6a2c75b5dcc3b88b2878be5362c58b51029d4e473a"} Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.159632 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="f7dfb030-e149-4806-bf36-54b8283a9027" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.225:9322/\": dial tcp 10.217.0.225:9322: connect: connection refused" Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.160552 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"15583e2e-4769-40c5-b09e-4aa70e3dc238","Type":"ContainerStarted","Data":"af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2"} Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.160581 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"15583e2e-4769-40c5-b09e-4aa70e3dc238","Type":"ContainerStarted","Data":"ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65"} Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.160590 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"15583e2e-4769-40c5-b09e-4aa70e3dc238","Type":"ContainerStarted","Data":"1bc5a52997906850877a33d3b57684d1e34de387e3af7c0038b3232a3a910438"} Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.161132 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.162293 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.226:9322/\": dial tcp 10.217.0.226:9322: connect: connection refused" Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.175508 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podStartSLOduration=2.17548298 podStartE2EDuration="2.17548298s" podCreationTimestamp="2025-10-03 09:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:09:31.168307793 +0000 UTC m=+1815.469802143" watchObservedRunningTime="2025-10-03 09:09:31.17548298 +0000 UTC m=+1815.476977310" Oct 03 09:09:31 crc 
kubenswrapper[4765]: I1003 09:09:31.216569 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-1" podStartSLOduration=2.2165503 podStartE2EDuration="2.2165503s" podCreationTimestamp="2025-10-03 09:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:09:31.207141825 +0000 UTC m=+1815.508636165" watchObservedRunningTime="2025-10-03 09:09:31.2165503 +0000 UTC m=+1815.518044630" Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.228071 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-api-0" podStartSLOduration=2.22805023 podStartE2EDuration="2.22805023s" podCreationTimestamp="2025-10-03 09:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:09:31.226456798 +0000 UTC m=+1815.527951128" watchObservedRunningTime="2025-10-03 09:09:31.22805023 +0000 UTC m=+1815.529544560" Oct 03 09:09:31 crc kubenswrapper[4765]: I1003 09:09:31.263772 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podStartSLOduration=2.26375263 podStartE2EDuration="2.26375263s" podCreationTimestamp="2025-10-03 09:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:09:31.262143908 +0000 UTC m=+1815.563638238" watchObservedRunningTime="2025-10-03 09:09:31.26375263 +0000 UTC m=+1815.565246960" Oct 03 09:09:34 crc kubenswrapper[4765]: I1003 09:09:34.698680 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:34 crc kubenswrapper[4765]: I1003 09:09:34.731365 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:34 crc kubenswrapper[4765]: I1003 09:09:34.731421 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:34 crc kubenswrapper[4765]: I1003 09:09:34.835606 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:34 crc kubenswrapper[4765]: I1003 09:09:34.846203 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:37 crc kubenswrapper[4765]: I1003 09:09:37.307421 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:09:37 crc kubenswrapper[4765]: E1003 09:09:37.307987 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:09:39 crc kubenswrapper[4765]: I1003 09:09:39.699077 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:39 crc kubenswrapper[4765]: I1003 09:09:39.706606 4765 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:39 crc kubenswrapper[4765]: I1003 09:09:39.731717 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:39 crc kubenswrapper[4765]: I1003 09:09:39.742880 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:39 crc kubenswrapper[4765]: I1003 09:09:39.809664 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:39 crc kubenswrapper[4765]: I1003 09:09:39.837554 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:39 crc kubenswrapper[4765]: I1003 09:09:39.846330 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:39 crc kubenswrapper[4765]: I1003 09:09:39.903632 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:40 crc kubenswrapper[4765]: I1003 09:09:40.238350 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:40 crc kubenswrapper[4765]: I1003 09:09:40.241692 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:09:40 crc kubenswrapper[4765]: I1003 09:09:40.243814 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:09:40 crc kubenswrapper[4765]: I1003 09:09:40.268432 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:09:40 crc kubenswrapper[4765]: I1003 09:09:40.285292 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:09:42 crc kubenswrapper[4765]: I1003 09:09:42.530674 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:42 crc kubenswrapper[4765]: I1003 09:09:42.531369 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="ceilometer-central-agent" containerID="cri-o://d0e8c32c1f9dd668ba1656d79e066d004298fc25e969e6502733139680d7ca22" gracePeriod=30 Oct 03 09:09:42 crc kubenswrapper[4765]: I1003 09:09:42.531462 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="ceilometer-notification-agent" containerID="cri-o://c96dddc539346e09fe9cb6f685f8a10316bf692897986860f69b2f04d321afab" gracePeriod=30 Oct 03 09:09:42 crc kubenswrapper[4765]: I1003 09:09:42.531520 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="proxy-httpd" containerID="cri-o://2aecf6015ad7cb8364e97e7103329978881d3a4e7f1aae01599c727dbceb7ab9" gracePeriod=30 Oct 03 09:09:42 crc kubenswrapper[4765]: I1003 09:09:42.531691 4765 
kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="sg-core" containerID="cri-o://ade6a3f26a6c6c39349acb24c2ed7e1ff8f7b63b70b4dee90e3a2edae7870479" gracePeriod=30 Oct 03 09:09:42 crc kubenswrapper[4765]: I1003 09:09:42.544908 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.222:3000/\": EOF" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.295364 4765 generic.go:334] "Generic (PLEG): container finished" podID="27cb84f7-b506-44f1-8200-799520d6baa9" containerID="2aecf6015ad7cb8364e97e7103329978881d3a4e7f1aae01599c727dbceb7ab9" exitCode=0 Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.295700 4765 generic.go:334] "Generic (PLEG): container finished" podID="27cb84f7-b506-44f1-8200-799520d6baa9" containerID="ade6a3f26a6c6c39349acb24c2ed7e1ff8f7b63b70b4dee90e3a2edae7870479" exitCode=2 Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.295710 4765 generic.go:334] "Generic (PLEG): container finished" podID="27cb84f7-b506-44f1-8200-799520d6baa9" containerID="c96dddc539346e09fe9cb6f685f8a10316bf692897986860f69b2f04d321afab" exitCode=0 Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.295717 4765 generic.go:334] "Generic (PLEG): container finished" podID="27cb84f7-b506-44f1-8200-799520d6baa9" containerID="d0e8c32c1f9dd668ba1656d79e066d004298fc25e969e6502733139680d7ca22" exitCode=0 Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.295435 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"27cb84f7-b506-44f1-8200-799520d6baa9","Type":"ContainerDied","Data":"2aecf6015ad7cb8364e97e7103329978881d3a4e7f1aae01599c727dbceb7ab9"} Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.295753 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"27cb84f7-b506-44f1-8200-799520d6baa9","Type":"ContainerDied","Data":"ade6a3f26a6c6c39349acb24c2ed7e1ff8f7b63b70b4dee90e3a2edae7870479"} Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.295767 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"27cb84f7-b506-44f1-8200-799520d6baa9","Type":"ContainerDied","Data":"c96dddc539346e09fe9cb6f685f8a10316bf692897986860f69b2f04d321afab"} Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.295776 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"27cb84f7-b506-44f1-8200-799520d6baa9","Type":"ContainerDied","Data":"d0e8c32c1f9dd668ba1656d79e066d004298fc25e969e6502733139680d7ca22"} Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.376379 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.528586 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-ceilometer-tls-certs\") pod \"27cb84f7-b506-44f1-8200-799520d6baa9\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.528664 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-combined-ca-bundle\") pod \"27cb84f7-b506-44f1-8200-799520d6baa9\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.528709 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-run-httpd\") pod \"27cb84f7-b506-44f1-8200-799520d6baa9\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.528778 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97wn9\" (UniqueName: \"kubernetes.io/projected/27cb84f7-b506-44f1-8200-799520d6baa9-kube-api-access-97wn9\") pod \"27cb84f7-b506-44f1-8200-799520d6baa9\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.529112 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "27cb84f7-b506-44f1-8200-799520d6baa9" (UID: "27cb84f7-b506-44f1-8200-799520d6baa9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.529634 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-sg-core-conf-yaml\") pod \"27cb84f7-b506-44f1-8200-799520d6baa9\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.529731 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-config-data\") pod \"27cb84f7-b506-44f1-8200-799520d6baa9\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.529767 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-scripts\") pod \"27cb84f7-b506-44f1-8200-799520d6baa9\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.529844 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-log-httpd\") pod \"27cb84f7-b506-44f1-8200-799520d6baa9\" (UID: \"27cb84f7-b506-44f1-8200-799520d6baa9\") " Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.530322 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.530704 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "27cb84f7-b506-44f1-8200-799520d6baa9" (UID: "27cb84f7-b506-44f1-8200-799520d6baa9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.534322 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-scripts" (OuterVolumeSpecName: "scripts") pod "27cb84f7-b506-44f1-8200-799520d6baa9" (UID: "27cb84f7-b506-44f1-8200-799520d6baa9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.537054 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27cb84f7-b506-44f1-8200-799520d6baa9-kube-api-access-97wn9" (OuterVolumeSpecName: "kube-api-access-97wn9") pod "27cb84f7-b506-44f1-8200-799520d6baa9" (UID: "27cb84f7-b506-44f1-8200-799520d6baa9"). InnerVolumeSpecName "kube-api-access-97wn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.567794 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "27cb84f7-b506-44f1-8200-799520d6baa9" (UID: "27cb84f7-b506-44f1-8200-799520d6baa9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.578419 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "27cb84f7-b506-44f1-8200-799520d6baa9" (UID: "27cb84f7-b506-44f1-8200-799520d6baa9"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.604813 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27cb84f7-b506-44f1-8200-799520d6baa9" (UID: "27cb84f7-b506-44f1-8200-799520d6baa9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.619306 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-config-data" (OuterVolumeSpecName: "config-data") pod "27cb84f7-b506-44f1-8200-799520d6baa9" (UID: "27cb84f7-b506-44f1-8200-799520d6baa9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.630872 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.630913 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.630927 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.630937 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cb84f7-b506-44f1-8200-799520d6baa9-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.630949 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.630962 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27cb84f7-b506-44f1-8200-799520d6baa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:43 crc kubenswrapper[4765]: I1003 09:09:43.630974 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97wn9\" (UniqueName: \"kubernetes.io/projected/27cb84f7-b506-44f1-8200-799520d6baa9-kube-api-access-97wn9\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.305758 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.315865 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"27cb84f7-b506-44f1-8200-799520d6baa9","Type":"ContainerDied","Data":"3c48ed6ddf8a4f06e9860ff39759ce2f9767fcdebd0c1a4bb09d7728241b2f51"} Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.316586 4765 scope.go:117] "RemoveContainer" containerID="2aecf6015ad7cb8364e97e7103329978881d3a4e7f1aae01599c727dbceb7ab9" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.336530 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.345197 4765 scope.go:117] "RemoveContainer" containerID="ade6a3f26a6c6c39349acb24c2ed7e1ff8f7b63b70b4dee90e3a2edae7870479" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.346456 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.371129 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:44 crc kubenswrapper[4765]: E1003 09:09:44.371437 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="ceilometer-notification-agent" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.371452 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="ceilometer-notification-agent" Oct 03 09:09:44 crc kubenswrapper[4765]: E1003 09:09:44.371468 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="proxy-httpd" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.371474 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="proxy-httpd" Oct 03 09:09:44 crc kubenswrapper[4765]: E1003 09:09:44.371499 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="ceilometer-central-agent" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.371511 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="ceilometer-central-agent" Oct 03 09:09:44 crc kubenswrapper[4765]: E1003 09:09:44.371531 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="sg-core" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.371538 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="sg-core" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.371849 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="proxy-httpd" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.371873 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="sg-core" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.371888 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="ceilometer-central-agent" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.371897 4765 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="27cb84f7-b506-44f1-8200-799520d6baa9" containerName="ceilometer-notification-agent" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.373783 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.377583 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.378231 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.387029 4765 scope.go:117] "RemoveContainer" containerID="c96dddc539346e09fe9cb6f685f8a10316bf692897986860f69b2f04d321afab" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.378777 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.387577 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.424606 4765 scope.go:117] "RemoveContainer" containerID="d0e8c32c1f9dd668ba1656d79e066d004298fc25e969e6502733139680d7ca22" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.445903 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.445961 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.445998 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-run-httpd\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.446025 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-config-data\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.446060 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7wdd\" (UniqueName: \"kubernetes.io/projected/61098995-6197-447a-b950-7ee9bfa2f643-kube-api-access-c7wdd\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.446108 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-scripts\") pod 
\"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.446135 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.446167 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-log-httpd\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.548129 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-scripts\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.548185 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.548232 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-log-httpd\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.548294 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.548347 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.548386 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-run-httpd\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.548417 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-config-data\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.548443 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c7wdd\" (UniqueName: \"kubernetes.io/projected/61098995-6197-447a-b950-7ee9bfa2f643-kube-api-access-c7wdd\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.548799 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-log-httpd\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.548848 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-run-httpd\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.553485 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.554117 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-scripts\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.555479 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.555662 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.557313 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-config-data\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.565892 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7wdd\" (UniqueName: \"kubernetes.io/projected/61098995-6197-447a-b950-7ee9bfa2f643-kube-api-access-c7wdd\") pod \"ceilometer-0\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:44 crc kubenswrapper[4765]: I1003 09:09:44.708720 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:09:45 crc kubenswrapper[4765]: I1003 09:09:45.195730 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:09:45 crc kubenswrapper[4765]: W1003 09:09:45.205362 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61098995_6197_447a_b950_7ee9bfa2f643.slice/crio-4c9e6692cb13288944cb9a1730a247f8a73d2338b7633fec915516d23dde6ae7 WatchSource:0}: Error finding container 4c9e6692cb13288944cb9a1730a247f8a73d2338b7633fec915516d23dde6ae7: Status 404 returned error can't find the container with id 4c9e6692cb13288944cb9a1730a247f8a73d2338b7633fec915516d23dde6ae7 Oct 03 09:09:45 crc kubenswrapper[4765]: I1003 09:09:45.326555 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"61098995-6197-447a-b950-7ee9bfa2f643","Type":"ContainerStarted","Data":"4c9e6692cb13288944cb9a1730a247f8a73d2338b7633fec915516d23dde6ae7"} Oct 03 09:09:46 crc kubenswrapper[4765]: I1003 09:09:46.345301 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27cb84f7-b506-44f1-8200-799520d6baa9" path="/var/lib/kubelet/pods/27cb84f7-b506-44f1-8200-799520d6baa9/volumes" Oct 03 09:09:46 crc kubenswrapper[4765]: I1003 09:09:46.356159 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"61098995-6197-447a-b950-7ee9bfa2f643","Type":"ContainerStarted","Data":"a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8"} Oct 03 09:09:47 crc kubenswrapper[4765]: I1003 09:09:47.377183 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"61098995-6197-447a-b950-7ee9bfa2f643","Type":"ContainerStarted","Data":"55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820"} Oct 03 09:09:48 crc kubenswrapper[4765]: I1003 09:09:48.390247 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"61098995-6197-447a-b950-7ee9bfa2f643","Type":"ContainerStarted","Data":"ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d"} Oct 03 09:09:49 crc kubenswrapper[4765]: I1003 09:09:49.336339 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:09:49 crc kubenswrapper[4765]: E1003 09:09:49.349585 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:09:49 crc kubenswrapper[4765]: I1003 09:09:49.402412 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"61098995-6197-447a-b950-7ee9bfa2f643","Type":"ContainerStarted","Data":"32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba"} Oct 03 09:09:49 crc kubenswrapper[4765]: I1003 09:09:49.433521 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=2.092100102 podStartE2EDuration="5.433498097s" podCreationTimestamp="2025-10-03 09:09:44 +0000 UTC" 
firstStartedPulling="2025-10-03 09:09:45.212985362 +0000 UTC m=+1829.514479692" lastFinishedPulling="2025-10-03 09:09:48.554383357 +0000 UTC m=+1832.855877687" observedRunningTime="2025-10-03 09:09:49.425664533 +0000 UTC m=+1833.727158883" watchObservedRunningTime="2025-10-03 09:09:49.433498097 +0000 UTC m=+1833.734992427" Oct 03 09:09:50 crc kubenswrapper[4765]: I1003 09:09:50.409963 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.157487 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9"] Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.159487 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.162452 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-scripts" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.162458 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"watcher-kuttl-config-data" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.168859 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9"] Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.222874 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-scripts-volume\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.222921 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48p7f\" (UniqueName: \"kubernetes.io/projected/2067f240-abcc-4950-a444-8a71a3f5484e-kube-api-access-48p7f\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.222957 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-config-data\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.223115 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-combined-ca-bundle\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.329268 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-config-data\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " 
pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.329336 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-combined-ca-bundle\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.329427 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-scripts-volume\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.329454 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48p7f\" (UniqueName: \"kubernetes.io/projected/2067f240-abcc-4950-a444-8a71a3f5484e-kube-api-access-48p7f\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.336303 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-scripts-volume\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.336962 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-combined-ca-bundle\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.344797 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-config-data\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.350339 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48p7f\" (UniqueName: \"kubernetes.io/projected/2067f240-abcc-4950-a444-8a71a3f5484e-kube-api-access-48p7f\") pod \"watcher-kuttl-db-purge-29324710-zm4p9\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.490925 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:00 crc kubenswrapper[4765]: W1003 09:10:00.962100 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2067f240_abcc_4950_a444_8a71a3f5484e.slice/crio-d25a1b41bd05605151708237c62547761e916bf9a7ba91b587dc9efa3826be0b WatchSource:0}: Error finding container d25a1b41bd05605151708237c62547761e916bf9a7ba91b587dc9efa3826be0b: Status 404 returned error can't find the container with id d25a1b41bd05605151708237c62547761e916bf9a7ba91b587dc9efa3826be0b Oct 03 09:10:00 crc kubenswrapper[4765]: I1003 09:10:00.962341 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9"] Oct 03 09:10:01 crc kubenswrapper[4765]: I1003 09:10:01.503686 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" event={"ID":"2067f240-abcc-4950-a444-8a71a3f5484e","Type":"ContainerStarted","Data":"52e5333e0c644f0e7810a257ae9627cd84df8e24ed12d8948d8e8ac2fd1bca37"} Oct 03 09:10:01 crc kubenswrapper[4765]: I1003 09:10:01.503939 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" event={"ID":"2067f240-abcc-4950-a444-8a71a3f5484e","Type":"ContainerStarted","Data":"d25a1b41bd05605151708237c62547761e916bf9a7ba91b587dc9efa3826be0b"} Oct 03 09:10:01 crc kubenswrapper[4765]: I1003 09:10:01.526385 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" podStartSLOduration=1.526366551 podStartE2EDuration="1.526366551s" podCreationTimestamp="2025-10-03 09:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:10:01.516567676 +0000 UTC m=+1845.818062006" watchObservedRunningTime="2025-10-03 09:10:01.526366551 +0000 UTC m=+1845.827860881" Oct 03 09:10:03 crc kubenswrapper[4765]: I1003 09:10:03.522953 4765 generic.go:334] "Generic (PLEG): container finished" podID="2067f240-abcc-4950-a444-8a71a3f5484e" containerID="52e5333e0c644f0e7810a257ae9627cd84df8e24ed12d8948d8e8ac2fd1bca37" exitCode=0 Oct 03 09:10:03 crc kubenswrapper[4765]: I1003 09:10:03.523068 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" event={"ID":"2067f240-abcc-4950-a444-8a71a3f5484e","Type":"ContainerDied","Data":"52e5333e0c644f0e7810a257ae9627cd84df8e24ed12d8948d8e8ac2fd1bca37"} Oct 03 09:10:04 crc kubenswrapper[4765]: I1003 09:10:04.306428 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:10:04 crc kubenswrapper[4765]: I1003 09:10:04.534402 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"1978eaa8d4c867a88cdf1bd67d12cbe56c8e728282f53b9cdc636787901cab02"} Oct 03 09:10:04 crc kubenswrapper[4765]: I1003 09:10:04.903777 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.002283 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-combined-ca-bundle\") pod \"2067f240-abcc-4950-a444-8a71a3f5484e\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.002389 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48p7f\" (UniqueName: \"kubernetes.io/projected/2067f240-abcc-4950-a444-8a71a3f5484e-kube-api-access-48p7f\") pod \"2067f240-abcc-4950-a444-8a71a3f5484e\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.002426 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-scripts-volume\") pod \"2067f240-abcc-4950-a444-8a71a3f5484e\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.002500 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-config-data\") pod \"2067f240-abcc-4950-a444-8a71a3f5484e\" (UID: \"2067f240-abcc-4950-a444-8a71a3f5484e\") " Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.008376 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-scripts-volume" (OuterVolumeSpecName: "scripts-volume") pod "2067f240-abcc-4950-a444-8a71a3f5484e" (UID: "2067f240-abcc-4950-a444-8a71a3f5484e"). InnerVolumeSpecName "scripts-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.012249 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2067f240-abcc-4950-a444-8a71a3f5484e-kube-api-access-48p7f" (OuterVolumeSpecName: "kube-api-access-48p7f") pod "2067f240-abcc-4950-a444-8a71a3f5484e" (UID: "2067f240-abcc-4950-a444-8a71a3f5484e"). InnerVolumeSpecName "kube-api-access-48p7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.028169 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2067f240-abcc-4950-a444-8a71a3f5484e" (UID: "2067f240-abcc-4950-a444-8a71a3f5484e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.052070 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-config-data" (OuterVolumeSpecName: "config-data") pod "2067f240-abcc-4950-a444-8a71a3f5484e" (UID: "2067f240-abcc-4950-a444-8a71a3f5484e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.105255 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.105288 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48p7f\" (UniqueName: \"kubernetes.io/projected/2067f240-abcc-4950-a444-8a71a3f5484e-kube-api-access-48p7f\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.105303 4765 reconciler_common.go:293] "Volume detached for volume \"scripts-volume\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-scripts-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.105314 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2067f240-abcc-4950-a444-8a71a3f5484e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.554712 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" event={"ID":"2067f240-abcc-4950-a444-8a71a3f5484e","Type":"ContainerDied","Data":"d25a1b41bd05605151708237c62547761e916bf9a7ba91b587dc9efa3826be0b"} Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.555007 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d25a1b41bd05605151708237c62547761e916bf9a7ba91b587dc9efa3826be0b" Oct 03 09:10:05 crc kubenswrapper[4765]: I1003 09:10:05.555087 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9" Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.752963 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-94ndg"] Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.761768 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-sync-94ndg"] Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.783733 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9"] Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.812530 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-db-purge-29324710-zm4p9"] Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.819616 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-h6kbd"] Oct 03 09:10:07 crc kubenswrapper[4765]: E1003 09:10:07.820373 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2067f240-abcc-4950-a444-8a71a3f5484e" containerName="watcher-db-manage" Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.820612 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2067f240-abcc-4950-a444-8a71a3f5484e" containerName="watcher-db-manage" Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.820943 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2067f240-abcc-4950-a444-8a71a3f5484e" containerName="watcher-db-manage" Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.821806 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-h6kbd" Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.830638 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-h6kbd"] Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.845340 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.845674 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" podUID="2e44e319-a13c-4e4a-bf9c-775e09f92bc3" containerName="watcher-decision-engine" containerID="cri-o://2f7dafdada6021fd8062c4874aa556dcda55353642cac651f706241412a4279f" gracePeriod=30 Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.894671 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.895212 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="f7dfb030-e149-4806-bf36-54b8283a9027" containerName="watcher-kuttl-api-log" containerID="cri-o://7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c" gracePeriod=30 Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.895641 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-0" podUID="f7dfb030-e149-4806-bf36-54b8283a9027" containerName="watcher-api" containerID="cri-o://b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7" gracePeriod=30 Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.921052 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.921360 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerName="watcher-kuttl-api-log" containerID="cri-o://ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65" gracePeriod=30 Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.921921 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerName="watcher-api" containerID="cri-o://af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2" gracePeriod=30 Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.943621 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.944343 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/watcher-kuttl-applier-0" podUID="5717d110-7f82-4448-8891-4a0dc6ae2703" containerName="watcher-applier" containerID="cri-o://6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4" gracePeriod=30 Oct 03 09:10:07 crc kubenswrapper[4765]: I1003 09:10:07.958844 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rksrg\" (UniqueName: \"kubernetes.io/projected/a850b634-7419-4983-ad82-8efd6dea5cb7-kube-api-access-rksrg\") pod \"watchertest-account-delete-h6kbd\" (UID: \"a850b634-7419-4983-ad82-8efd6dea5cb7\") " 
pod="watcher-kuttl-default/watchertest-account-delete-h6kbd" Oct 03 09:10:08 crc kubenswrapper[4765]: I1003 09:10:08.066963 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rksrg\" (UniqueName: \"kubernetes.io/projected/a850b634-7419-4983-ad82-8efd6dea5cb7-kube-api-access-rksrg\") pod \"watchertest-account-delete-h6kbd\" (UID: \"a850b634-7419-4983-ad82-8efd6dea5cb7\") " pod="watcher-kuttl-default/watchertest-account-delete-h6kbd" Oct 03 09:10:08 crc kubenswrapper[4765]: I1003 09:10:08.089944 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rksrg\" (UniqueName: \"kubernetes.io/projected/a850b634-7419-4983-ad82-8efd6dea5cb7-kube-api-access-rksrg\") pod \"watchertest-account-delete-h6kbd\" (UID: \"a850b634-7419-4983-ad82-8efd6dea5cb7\") " pod="watcher-kuttl-default/watchertest-account-delete-h6kbd" Oct 03 09:10:08 crc kubenswrapper[4765]: I1003 09:10:08.149097 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-h6kbd" Oct 03 09:10:08 crc kubenswrapper[4765]: I1003 09:10:08.321389 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2067f240-abcc-4950-a444-8a71a3f5484e" path="/var/lib/kubelet/pods/2067f240-abcc-4950-a444-8a71a3f5484e/volumes" Oct 03 09:10:08 crc kubenswrapper[4765]: I1003 09:10:08.322634 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab" path="/var/lib/kubelet/pods/7f5e0ff7-6ca7-4937-99f2-e4c122f2e9ab/volumes" Oct 03 09:10:08 crc kubenswrapper[4765]: I1003 09:10:08.588153 4765 generic.go:334] "Generic (PLEG): container finished" podID="f7dfb030-e149-4806-bf36-54b8283a9027" containerID="7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c" exitCode=143 Oct 03 09:10:08 crc kubenswrapper[4765]: I1003 09:10:08.588236 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"f7dfb030-e149-4806-bf36-54b8283a9027","Type":"ContainerDied","Data":"7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c"} Oct 03 09:10:08 crc kubenswrapper[4765]: I1003 09:10:08.591443 4765 generic.go:334] "Generic (PLEG): container finished" podID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerID="ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65" exitCode=143 Oct 03 09:10:08 crc kubenswrapper[4765]: I1003 09:10:08.591498 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"15583e2e-4769-40c5-b09e-4aa70e3dc238","Type":"ContainerDied","Data":"ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65"} Oct 03 09:10:08 crc kubenswrapper[4765]: I1003 09:10:08.643648 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-h6kbd"] Oct 03 09:10:08 crc kubenswrapper[4765]: W1003 09:10:08.651680 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda850b634_7419_4983_ad82_8efd6dea5cb7.slice/crio-55533fe2d733b86f6757c85069919c6a57922976789d40f40d50a86b031e4a04 WatchSource:0}: Error finding container 55533fe2d733b86f6757c85069919c6a57922976789d40f40d50a86b031e4a04: Status 404 returned error can't find the container with id 55533fe2d733b86f6757c85069919c6a57922976789d40f40d50a86b031e4a04 Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.356943 4765 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.459621 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.498296 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szbpj\" (UniqueName: \"kubernetes.io/projected/f7dfb030-e149-4806-bf36-54b8283a9027-kube-api-access-szbpj\") pod \"f7dfb030-e149-4806-bf36-54b8283a9027\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.498355 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-cert-memcached-mtls\") pod \"f7dfb030-e149-4806-bf36-54b8283a9027\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.498397 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7dfb030-e149-4806-bf36-54b8283a9027-logs\") pod \"f7dfb030-e149-4806-bf36-54b8283a9027\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.498466 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-config-data\") pod \"f7dfb030-e149-4806-bf36-54b8283a9027\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.498489 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-custom-prometheus-ca\") pod \"f7dfb030-e149-4806-bf36-54b8283a9027\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.498583 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-combined-ca-bundle\") pod \"f7dfb030-e149-4806-bf36-54b8283a9027\" (UID: \"f7dfb030-e149-4806-bf36-54b8283a9027\") " Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.499866 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7dfb030-e149-4806-bf36-54b8283a9027-logs" (OuterVolumeSpecName: "logs") pod "f7dfb030-e149-4806-bf36-54b8283a9027" (UID: "f7dfb030-e149-4806-bf36-54b8283a9027"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.505677 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7dfb030-e149-4806-bf36-54b8283a9027-kube-api-access-szbpj" (OuterVolumeSpecName: "kube-api-access-szbpj") pod "f7dfb030-e149-4806-bf36-54b8283a9027" (UID: "f7dfb030-e149-4806-bf36-54b8283a9027"). InnerVolumeSpecName "kube-api-access-szbpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.529833 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7dfb030-e149-4806-bf36-54b8283a9027" (UID: "f7dfb030-e149-4806-bf36-54b8283a9027"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.530823 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "f7dfb030-e149-4806-bf36-54b8283a9027" (UID: "f7dfb030-e149-4806-bf36-54b8283a9027"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.547942 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-config-data" (OuterVolumeSpecName: "config-data") pod "f7dfb030-e149-4806-bf36-54b8283a9027" (UID: "f7dfb030-e149-4806-bf36-54b8283a9027"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.599450 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-combined-ca-bundle\") pod \"5717d110-7f82-4448-8891-4a0dc6ae2703\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.599566 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-config-data\") pod \"5717d110-7f82-4448-8891-4a0dc6ae2703\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.599599 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-cert-memcached-mtls\") pod \"5717d110-7f82-4448-8891-4a0dc6ae2703\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.599765 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smmbc\" (UniqueName: \"kubernetes.io/projected/5717d110-7f82-4448-8891-4a0dc6ae2703-kube-api-access-smmbc\") pod \"5717d110-7f82-4448-8891-4a0dc6ae2703\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.599799 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5717d110-7f82-4448-8891-4a0dc6ae2703-logs\") pod \"5717d110-7f82-4448-8891-4a0dc6ae2703\" (UID: \"5717d110-7f82-4448-8891-4a0dc6ae2703\") " Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.600264 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szbpj\" (UniqueName: \"kubernetes.io/projected/f7dfb030-e149-4806-bf36-54b8283a9027-kube-api-access-szbpj\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.600293 4765 reconciler_common.go:293] "Volume detached for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7dfb030-e149-4806-bf36-54b8283a9027-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.600309 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.600321 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.600332 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.600700 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5717d110-7f82-4448-8891-4a0dc6ae2703-logs" (OuterVolumeSpecName: "logs") pod "5717d110-7f82-4448-8891-4a0dc6ae2703" (UID: "5717d110-7f82-4448-8891-4a0dc6ae2703"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.603134 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "f7dfb030-e149-4806-bf36-54b8283a9027" (UID: "f7dfb030-e149-4806-bf36-54b8283a9027"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.603809 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5717d110-7f82-4448-8891-4a0dc6ae2703-kube-api-access-smmbc" (OuterVolumeSpecName: "kube-api-access-smmbc") pod "5717d110-7f82-4448-8891-4a0dc6ae2703" (UID: "5717d110-7f82-4448-8891-4a0dc6ae2703"). InnerVolumeSpecName "kube-api-access-smmbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.615153 4765 generic.go:334] "Generic (PLEG): container finished" podID="a850b634-7419-4983-ad82-8efd6dea5cb7" containerID="c5aa3d69e81e36f32d78eaaeb917f4e4c5484bccba300b7d17b4d734bfc45b24" exitCode=0 Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.615242 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-h6kbd" event={"ID":"a850b634-7419-4983-ad82-8efd6dea5cb7","Type":"ContainerDied","Data":"c5aa3d69e81e36f32d78eaaeb917f4e4c5484bccba300b7d17b4d734bfc45b24"} Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.615277 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-h6kbd" event={"ID":"a850b634-7419-4983-ad82-8efd6dea5cb7","Type":"ContainerStarted","Data":"55533fe2d733b86f6757c85069919c6a57922976789d40f40d50a86b031e4a04"} Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.634553 4765 generic.go:334] "Generic (PLEG): container finished" podID="5717d110-7f82-4448-8891-4a0dc6ae2703" containerID="6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4" exitCode=0 Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.634635 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"5717d110-7f82-4448-8891-4a0dc6ae2703","Type":"ContainerDied","Data":"6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4"} Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.634681 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-applier-0" event={"ID":"5717d110-7f82-4448-8891-4a0dc6ae2703","Type":"ContainerDied","Data":"64ec5330ef3fb8efbf1e78af1cd48a08eb5eeb39e6029de50dff0ce87af3ecad"} Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.634723 4765 scope.go:117] "RemoveContainer" containerID="6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.634994 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-applier-0" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.650691 4765 generic.go:334] "Generic (PLEG): container finished" podID="f7dfb030-e149-4806-bf36-54b8283a9027" containerID="b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7" exitCode=0 Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.650735 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"f7dfb030-e149-4806-bf36-54b8283a9027","Type":"ContainerDied","Data":"b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7"} Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.650760 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-0" event={"ID":"f7dfb030-e149-4806-bf36-54b8283a9027","Type":"ContainerDied","Data":"6fa95d06c4af9c88ede6b79db55d33367db0f16af253befc47f5356328c1aa4f"} Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.650833 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-0" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.655379 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5717d110-7f82-4448-8891-4a0dc6ae2703" (UID: "5717d110-7f82-4448-8891-4a0dc6ae2703"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.661161 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-config-data" (OuterVolumeSpecName: "config-data") pod "5717d110-7f82-4448-8891-4a0dc6ae2703" (UID: "5717d110-7f82-4448-8891-4a0dc6ae2703"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.708612 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.708673 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smmbc\" (UniqueName: \"kubernetes.io/projected/5717d110-7f82-4448-8891-4a0dc6ae2703-kube-api-access-smmbc\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.708697 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5717d110-7f82-4448-8891-4a0dc6ae2703-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.708710 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/f7dfb030-e149-4806-bf36-54b8283a9027-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.708723 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.711969 4765 scope.go:117] "RemoveContainer" containerID="6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4" Oct 03 09:10:09 crc kubenswrapper[4765]: E1003 09:10:09.712598 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4\": container with ID starting with 6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4 not found: ID does not exist" containerID="6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.712750 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4"} err="failed to get container status \"6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4\": rpc error: code = NotFound desc = could not find container \"6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4\": container with ID starting with 6c4f112880a058f5288cb046ab769b1fd48c801d885e8923750566b8cc6f9bd4 not found: ID does not exist" Oct 03 09:10:09 crc 
kubenswrapper[4765]: I1003 09:10:09.712846 4765 scope.go:117] "RemoveContainer" containerID="b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.724968 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "5717d110-7f82-4448-8891-4a0dc6ae2703" (UID: "5717d110-7f82-4448-8891-4a0dc6ae2703"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.728506 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.732229 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.226:9322/\": dial tcp 10.217.0.226:9322: connect: connection refused" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.732256 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/watcher-kuttl-api-1" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerName="watcher-kuttl-api-log" probeResult="failure" output="Get \"http://10.217.0.226:9322/\": dial tcp 10.217.0.226:9322: connect: connection refused" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.738634 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-0"] Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.741081 4765 scope.go:117] "RemoveContainer" containerID="7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.761985 4765 scope.go:117] "RemoveContainer" containerID="b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7" Oct 03 09:10:09 crc kubenswrapper[4765]: E1003 09:10:09.764106 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7\": container with ID starting with b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7 not found: ID does not exist" containerID="b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.764148 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7"} err="failed to get container status \"b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7\": rpc error: code = NotFound desc = could not find container \"b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7\": container with ID starting with b926ca3b2525a6bb74bbb2dc98d17adc2b5a19d17c0aa49fccf302e0efda2cb7 not found: ID does not exist" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.764177 4765 scope.go:117] "RemoveContainer" containerID="7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c" Oct 03 09:10:09 crc kubenswrapper[4765]: E1003 09:10:09.764954 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c\": container with ID 
starting with 7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c not found: ID does not exist" containerID="7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.764998 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c"} err="failed to get container status \"7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c\": rpc error: code = NotFound desc = could not find container \"7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c\": container with ID starting with 7a9bf1461dd3939caa54fc132139cf8dbaded387e7e1d579891f45191d60352c not found: ID does not exist" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.811126 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/5717d110-7f82-4448-8891-4a0dc6ae2703-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.973117 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.980230 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-applier-0"] Oct 03 09:10:09 crc kubenswrapper[4765]: I1003 09:10:09.995340 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.013237 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-777js\" (UniqueName: \"kubernetes.io/projected/15583e2e-4769-40c5-b09e-4aa70e3dc238-kube-api-access-777js\") pod \"15583e2e-4769-40c5-b09e-4aa70e3dc238\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.013340 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-cert-memcached-mtls\") pod \"15583e2e-4769-40c5-b09e-4aa70e3dc238\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.013383 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-combined-ca-bundle\") pod \"15583e2e-4769-40c5-b09e-4aa70e3dc238\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.013411 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15583e2e-4769-40c5-b09e-4aa70e3dc238-logs\") pod \"15583e2e-4769-40c5-b09e-4aa70e3dc238\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.013445 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-custom-prometheus-ca\") pod \"15583e2e-4769-40c5-b09e-4aa70e3dc238\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.013473 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-config-data\") pod \"15583e2e-4769-40c5-b09e-4aa70e3dc238\" (UID: \"15583e2e-4769-40c5-b09e-4aa70e3dc238\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.014007 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15583e2e-4769-40c5-b09e-4aa70e3dc238-logs" (OuterVolumeSpecName: "logs") pod "15583e2e-4769-40c5-b09e-4aa70e3dc238" (UID: "15583e2e-4769-40c5-b09e-4aa70e3dc238"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.014249 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15583e2e-4769-40c5-b09e-4aa70e3dc238-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.017938 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15583e2e-4769-40c5-b09e-4aa70e3dc238-kube-api-access-777js" (OuterVolumeSpecName: "kube-api-access-777js") pod "15583e2e-4769-40c5-b09e-4aa70e3dc238" (UID: "15583e2e-4769-40c5-b09e-4aa70e3dc238"). InnerVolumeSpecName "kube-api-access-777js". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.078883 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-config-data" (OuterVolumeSpecName: "config-data") pod "15583e2e-4769-40c5-b09e-4aa70e3dc238" (UID: "15583e2e-4769-40c5-b09e-4aa70e3dc238"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.117619 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-777js\" (UniqueName: \"kubernetes.io/projected/15583e2e-4769-40c5-b09e-4aa70e3dc238-kube-api-access-777js\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.117701 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.164840 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "15583e2e-4769-40c5-b09e-4aa70e3dc238" (UID: "15583e2e-4769-40c5-b09e-4aa70e3dc238"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.185887 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15583e2e-4769-40c5-b09e-4aa70e3dc238" (UID: "15583e2e-4769-40c5-b09e-4aa70e3dc238"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.222682 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.222709 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.226822 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "15583e2e-4769-40c5-b09e-4aa70e3dc238" (UID: "15583e2e-4769-40c5-b09e-4aa70e3dc238"). InnerVolumeSpecName "cert-memcached-mtls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.318576 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5717d110-7f82-4448-8891-4a0dc6ae2703" path="/var/lib/kubelet/pods/5717d110-7f82-4448-8891-4a0dc6ae2703/volumes" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.319298 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7dfb030-e149-4806-bf36-54b8283a9027" path="/var/lib/kubelet/pods/f7dfb030-e149-4806-bf36-54b8283a9027/volumes" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.324029 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/15583e2e-4769-40c5-b09e-4aa70e3dc238-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.671157 4765 generic.go:334] "Generic (PLEG): container finished" podID="2e44e319-a13c-4e4a-bf9c-775e09f92bc3" containerID="2f7dafdada6021fd8062c4874aa556dcda55353642cac651f706241412a4279f" exitCode=0 Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.671501 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2e44e319-a13c-4e4a-bf9c-775e09f92bc3","Type":"ContainerDied","Data":"2f7dafdada6021fd8062c4874aa556dcda55353642cac651f706241412a4279f"} Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.673221 4765 generic.go:334] "Generic (PLEG): container finished" podID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerID="af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2" exitCode=0 Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.673378 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-api-1" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.674126 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"15583e2e-4769-40c5-b09e-4aa70e3dc238","Type":"ContainerDied","Data":"af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2"} Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.674146 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-api-1" event={"ID":"15583e2e-4769-40c5-b09e-4aa70e3dc238","Type":"ContainerDied","Data":"1bc5a52997906850877a33d3b57684d1e34de387e3af7c0038b3232a3a910438"} Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.674161 4765 scope.go:117] "RemoveContainer" containerID="af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.697638 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.706170 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-api-1"] Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.708012 4765 scope.go:117] "RemoveContainer" containerID="ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.719744 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.733964 4765 scope.go:117] "RemoveContainer" containerID="af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2" Oct 03 09:10:10 crc kubenswrapper[4765]: E1003 09:10:10.736563 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2\": container with ID starting with af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2 not found: ID does not exist" containerID="af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.736613 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2"} err="failed to get container status \"af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2\": rpc error: code = NotFound desc = could not find container \"af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2\": container with ID starting with af30e18d65b447d9070ee30a84c41a260bb43bd8afbd273bacaf58110a4a2bf2 not found: ID does not exist" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.736664 4765 scope.go:117] "RemoveContainer" containerID="ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65" Oct 03 09:10:10 crc kubenswrapper[4765]: E1003 09:10:10.737877 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65\": container with ID starting with ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65 not found: ID does not exist" containerID="ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.737926 4765 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65"} err="failed to get container status \"ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65\": rpc error: code = NotFound desc = could not find container \"ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65\": container with ID starting with ecaae0984a5e363a2cd98663328b28aa2f9495c40ae3666535c89c97660cfd65 not found: ID does not exist" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.831473 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-combined-ca-bundle\") pod \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.831551 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-cert-memcached-mtls\") pod \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.831628 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-config-data\") pod \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.831728 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-logs\") pod \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.831780 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-custom-prometheus-ca\") pod \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.831812 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtdgd\" (UniqueName: \"kubernetes.io/projected/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-kube-api-access-vtdgd\") pod \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\" (UID: \"2e44e319-a13c-4e4a-bf9c-775e09f92bc3\") " Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.832814 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-logs" (OuterVolumeSpecName: "logs") pod "2e44e319-a13c-4e4a-bf9c-775e09f92bc3" (UID: "2e44e319-a13c-4e4a-bf9c-775e09f92bc3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.836263 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-kube-api-access-vtdgd" (OuterVolumeSpecName: "kube-api-access-vtdgd") pod "2e44e319-a13c-4e4a-bf9c-775e09f92bc3" (UID: "2e44e319-a13c-4e4a-bf9c-775e09f92bc3"). InnerVolumeSpecName "kube-api-access-vtdgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.862795 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e44e319-a13c-4e4a-bf9c-775e09f92bc3" (UID: "2e44e319-a13c-4e4a-bf9c-775e09f92bc3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.866520 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "2e44e319-a13c-4e4a-bf9c-775e09f92bc3" (UID: "2e44e319-a13c-4e4a-bf9c-775e09f92bc3"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.904829 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-config-data" (OuterVolumeSpecName: "config-data") pod "2e44e319-a13c-4e4a-bf9c-775e09f92bc3" (UID: "2e44e319-a13c-4e4a-bf9c-775e09f92bc3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.927106 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-cert-memcached-mtls" (OuterVolumeSpecName: "cert-memcached-mtls") pod "2e44e319-a13c-4e4a-bf9c-775e09f92bc3" (UID: "2e44e319-a13c-4e4a-bf9c-775e09f92bc3"). InnerVolumeSpecName "cert-memcached-mtls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.934008 4765 reconciler_common.go:293] "Volume detached for volume \"cert-memcached-mtls\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-cert-memcached-mtls\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.934041 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.934054 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.934063 4765 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.934073 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtdgd\" (UniqueName: \"kubernetes.io/projected/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-kube-api-access-vtdgd\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.934083 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e44e319-a13c-4e4a-bf9c-775e09f92bc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972078 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-72d7q"] Oct 03 09:10:10 crc kubenswrapper[4765]: E1003 09:10:10.972473 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7dfb030-e149-4806-bf36-54b8283a9027" containerName="watcher-kuttl-api-log" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972493 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7dfb030-e149-4806-bf36-54b8283a9027" containerName="watcher-kuttl-api-log" Oct 03 09:10:10 crc kubenswrapper[4765]: E1003 09:10:10.972507 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerName="watcher-api" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972513 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerName="watcher-api" Oct 03 09:10:10 crc kubenswrapper[4765]: E1003 09:10:10.972522 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e44e319-a13c-4e4a-bf9c-775e09f92bc3" containerName="watcher-decision-engine" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972529 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e44e319-a13c-4e4a-bf9c-775e09f92bc3" containerName="watcher-decision-engine" Oct 03 09:10:10 crc kubenswrapper[4765]: E1003 09:10:10.972543 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7dfb030-e149-4806-bf36-54b8283a9027" containerName="watcher-api" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972549 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7dfb030-e149-4806-bf36-54b8283a9027" containerName="watcher-api" Oct 03 09:10:10 crc kubenswrapper[4765]: E1003 09:10:10.972571 4765 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5717d110-7f82-4448-8891-4a0dc6ae2703" containerName="watcher-applier" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972576 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5717d110-7f82-4448-8891-4a0dc6ae2703" containerName="watcher-applier" Oct 03 09:10:10 crc kubenswrapper[4765]: E1003 09:10:10.972590 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerName="watcher-kuttl-api-log" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972596 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerName="watcher-kuttl-api-log" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972762 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7dfb030-e149-4806-bf36-54b8283a9027" containerName="watcher-kuttl-api-log" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972776 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerName="watcher-kuttl-api-log" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972788 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7dfb030-e149-4806-bf36-54b8283a9027" containerName="watcher-api" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972799 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" containerName="watcher-api" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972810 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5717d110-7f82-4448-8891-4a0dc6ae2703" containerName="watcher-applier" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.972818 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e44e319-a13c-4e4a-bf9c-775e09f92bc3" containerName="watcher-decision-engine" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.974145 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:10 crc kubenswrapper[4765]: I1003 09:10:10.987083 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-72d7q"] Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.107755 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-h6kbd" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.137486 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-catalog-content\") pod \"community-operators-72d7q\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.137545 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-utilities\") pod \"community-operators-72d7q\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.137598 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9rg5\" (UniqueName: \"kubernetes.io/projected/c8739f94-bfcd-4c01-8025-d39aaddacde0-kube-api-access-w9rg5\") pod \"community-operators-72d7q\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.196876 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.197211 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="ceilometer-central-agent" containerID="cri-o://a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8" gracePeriod=30 Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.197367 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="proxy-httpd" containerID="cri-o://32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba" gracePeriod=30 Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.197425 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="sg-core" containerID="cri-o://ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d" gracePeriod=30 Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.197470 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="watcher-kuttl-default/ceilometer-0" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="ceilometer-notification-agent" containerID="cri-o://55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820" gracePeriod=30 Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.238902 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rksrg\" (UniqueName: \"kubernetes.io/projected/a850b634-7419-4983-ad82-8efd6dea5cb7-kube-api-access-rksrg\") pod \"a850b634-7419-4983-ad82-8efd6dea5cb7\" (UID: \"a850b634-7419-4983-ad82-8efd6dea5cb7\") " Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.239204 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9rg5\" (UniqueName: 
\"kubernetes.io/projected/c8739f94-bfcd-4c01-8025-d39aaddacde0-kube-api-access-w9rg5\") pod \"community-operators-72d7q\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.239375 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-catalog-content\") pod \"community-operators-72d7q\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.239439 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-utilities\") pod \"community-operators-72d7q\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.239954 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-utilities\") pod \"community-operators-72d7q\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.240502 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-catalog-content\") pod \"community-operators-72d7q\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.243439 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a850b634-7419-4983-ad82-8efd6dea5cb7-kube-api-access-rksrg" (OuterVolumeSpecName: "kube-api-access-rksrg") pod "a850b634-7419-4983-ad82-8efd6dea5cb7" (UID: "a850b634-7419-4983-ad82-8efd6dea5cb7"). InnerVolumeSpecName "kube-api-access-rksrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.257790 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9rg5\" (UniqueName: \"kubernetes.io/projected/c8739f94-bfcd-4c01-8025-d39aaddacde0-kube-api-access-w9rg5\") pod \"community-operators-72d7q\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.293405 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.301321 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="watcher-kuttl-default/ceilometer-0" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.229:3000/\": read tcp 10.217.0.2:58508->10.217.0.229:3000: read: connection reset by peer" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.342486 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rksrg\" (UniqueName: \"kubernetes.io/projected/a850b634-7419-4983-ad82-8efd6dea5cb7-kube-api-access-rksrg\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.685858 4765 generic.go:334] "Generic (PLEG): container finished" podID="61098995-6197-447a-b950-7ee9bfa2f643" containerID="32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba" exitCode=0 Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.686162 4765 generic.go:334] "Generic (PLEG): container finished" podID="61098995-6197-447a-b950-7ee9bfa2f643" containerID="ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d" exitCode=2 Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.685933 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"61098995-6197-447a-b950-7ee9bfa2f643","Type":"ContainerDied","Data":"32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba"} Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.686222 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"61098995-6197-447a-b950-7ee9bfa2f643","Type":"ContainerDied","Data":"ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d"} Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.688454 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watchertest-account-delete-h6kbd" event={"ID":"a850b634-7419-4983-ad82-8efd6dea5cb7","Type":"ContainerDied","Data":"55533fe2d733b86f6757c85069919c6a57922976789d40f40d50a86b031e4a04"} Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.688480 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55533fe2d733b86f6757c85069919c6a57922976789d40f40d50a86b031e4a04" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.688531 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/watchertest-account-delete-h6kbd" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.690331 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" event={"ID":"2e44e319-a13c-4e4a-bf9c-775e09f92bc3","Type":"ContainerDied","Data":"f612bd5c46fc3581179eab6a2c75b5dcc3b88b2878be5362c58b51029d4e473a"} Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.690397 4765 scope.go:117] "RemoveContainer" containerID="2f7dafdada6021fd8062c4874aa556dcda55353642cac651f706241412a4279f" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.690355 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/watcher-kuttl-decision-engine-0" Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.733802 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.739908 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-kuttl-decision-engine-0"] Oct 03 09:10:11 crc kubenswrapper[4765]: I1003 09:10:11.871289 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-72d7q"] Oct 03 09:10:11 crc kubenswrapper[4765]: W1003 09:10:11.872293 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8739f94_bfcd_4c01_8025_d39aaddacde0.slice/crio-fc46318fd066ed07a06cd1fc1ce0b325cfb2ed08c5042ce5fad976d4b160d1e4 WatchSource:0}: Error finding container fc46318fd066ed07a06cd1fc1ce0b325cfb2ed08c5042ce5fad976d4b160d1e4: Status 404 returned error can't find the container with id fc46318fd066ed07a06cd1fc1ce0b325cfb2ed08c5042ce5fad976d4b160d1e4 Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.315836 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15583e2e-4769-40c5-b09e-4aa70e3dc238" path="/var/lib/kubelet/pods/15583e2e-4769-40c5-b09e-4aa70e3dc238/volumes" Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.317188 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e44e319-a13c-4e4a-bf9c-775e09f92bc3" path="/var/lib/kubelet/pods/2e44e319-a13c-4e4a-bf9c-775e09f92bc3/volumes" Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.703350 4765 generic.go:334] "Generic (PLEG): container finished" podID="c8739f94-bfcd-4c01-8025-d39aaddacde0" containerID="965537b1c402acc3310196eba3f8fa4b5c495fa54852550bdfd467b1f9073860" exitCode=0 Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.703471 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d7q" event={"ID":"c8739f94-bfcd-4c01-8025-d39aaddacde0","Type":"ContainerDied","Data":"965537b1c402acc3310196eba3f8fa4b5c495fa54852550bdfd467b1f9073860"} Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.703517 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d7q" event={"ID":"c8739f94-bfcd-4c01-8025-d39aaddacde0","Type":"ContainerStarted","Data":"fc46318fd066ed07a06cd1fc1ce0b325cfb2ed08c5042ce5fad976d4b160d1e4"} Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.709871 4765 generic.go:334] "Generic (PLEG): container finished" podID="61098995-6197-447a-b950-7ee9bfa2f643" containerID="a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8" exitCode=0 Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.709897 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"61098995-6197-447a-b950-7ee9bfa2f643","Type":"ContainerDied","Data":"a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8"} Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.804794 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-db-create-9qsk6"] Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.819066 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-db-create-9qsk6"] Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.826073 4765 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-pfw25"] Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.833047 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-h6kbd"] Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.841454 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watchertest-account-delete-h6kbd"] Oct 03 09:10:12 crc kubenswrapper[4765]: I1003 09:10:12.846785 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/watcher-test-account-create-pfw25"] Oct 03 09:10:13 crc kubenswrapper[4765]: I1003 09:10:13.720598 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d7q" event={"ID":"c8739f94-bfcd-4c01-8025-d39aaddacde0","Type":"ContainerStarted","Data":"25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b"} Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.281972 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.316699 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b719514-ecf2-47e6-a002-fb51b387e66f" path="/var/lib/kubelet/pods/4b719514-ecf2-47e6-a002-fb51b387e66f/volumes" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.317191 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87745db5-ec67-4cc3-9f16-fdde96423caa" path="/var/lib/kubelet/pods/87745db5-ec67-4cc3-9f16-fdde96423caa/volumes" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.317931 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a850b634-7419-4983-ad82-8efd6dea5cb7" path="/var/lib/kubelet/pods/a850b634-7419-4983-ad82-8efd6dea5cb7/volumes" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.389273 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-scripts\") pod \"61098995-6197-447a-b950-7ee9bfa2f643\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.390301 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-log-httpd\") pod \"61098995-6197-447a-b950-7ee9bfa2f643\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.390365 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-ceilometer-tls-certs\") pod \"61098995-6197-447a-b950-7ee9bfa2f643\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.390386 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-sg-core-conf-yaml\") pod \"61098995-6197-447a-b950-7ee9bfa2f643\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.390429 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-combined-ca-bundle\") pod \"61098995-6197-447a-b950-7ee9bfa2f643\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.390483 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7wdd\" (UniqueName: \"kubernetes.io/projected/61098995-6197-447a-b950-7ee9bfa2f643-kube-api-access-c7wdd\") pod \"61098995-6197-447a-b950-7ee9bfa2f643\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.390605 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-config-data\") pod \"61098995-6197-447a-b950-7ee9bfa2f643\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.390637 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-run-httpd\") pod \"61098995-6197-447a-b950-7ee9bfa2f643\" (UID: \"61098995-6197-447a-b950-7ee9bfa2f643\") " Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.390821 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "61098995-6197-447a-b950-7ee9bfa2f643" (UID: "61098995-6197-447a-b950-7ee9bfa2f643"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.391092 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.391118 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "61098995-6197-447a-b950-7ee9bfa2f643" (UID: "61098995-6197-447a-b950-7ee9bfa2f643"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.395110 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-scripts" (OuterVolumeSpecName: "scripts") pod "61098995-6197-447a-b950-7ee9bfa2f643" (UID: "61098995-6197-447a-b950-7ee9bfa2f643"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.395148 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61098995-6197-447a-b950-7ee9bfa2f643-kube-api-access-c7wdd" (OuterVolumeSpecName: "kube-api-access-c7wdd") pod "61098995-6197-447a-b950-7ee9bfa2f643" (UID: "61098995-6197-447a-b950-7ee9bfa2f643"). InnerVolumeSpecName "kube-api-access-c7wdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.419403 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "61098995-6197-447a-b950-7ee9bfa2f643" (UID: "61098995-6197-447a-b950-7ee9bfa2f643"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.449499 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "61098995-6197-447a-b950-7ee9bfa2f643" (UID: "61098995-6197-447a-b950-7ee9bfa2f643"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.472469 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61098995-6197-447a-b950-7ee9bfa2f643" (UID: "61098995-6197-447a-b950-7ee9bfa2f643"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.493758 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.493795 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.493804 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.493814 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7wdd\" (UniqueName: \"kubernetes.io/projected/61098995-6197-447a-b950-7ee9bfa2f643-kube-api-access-c7wdd\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.493824 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/61098995-6197-447a-b950-7ee9bfa2f643-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.493833 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.499163 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-config-data" (OuterVolumeSpecName: "config-data") pod "61098995-6197-447a-b950-7ee9bfa2f643" (UID: "61098995-6197-447a-b950-7ee9bfa2f643"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.596055 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61098995-6197-447a-b950-7ee9bfa2f643-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.733396 4765 generic.go:334] "Generic (PLEG): container finished" podID="61098995-6197-447a-b950-7ee9bfa2f643" containerID="55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820" exitCode=0 Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.733487 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.733501 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"61098995-6197-447a-b950-7ee9bfa2f643","Type":"ContainerDied","Data":"55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820"} Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.733587 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"61098995-6197-447a-b950-7ee9bfa2f643","Type":"ContainerDied","Data":"4c9e6692cb13288944cb9a1730a247f8a73d2338b7633fec915516d23dde6ae7"} Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.733614 4765 scope.go:117] "RemoveContainer" containerID="32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.736505 4765 generic.go:334] "Generic (PLEG): container finished" podID="c8739f94-bfcd-4c01-8025-d39aaddacde0" containerID="25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b" exitCode=0 Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.736550 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d7q" event={"ID":"c8739f94-bfcd-4c01-8025-d39aaddacde0","Type":"ContainerDied","Data":"25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b"} Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.794193 4765 scope.go:117] "RemoveContainer" containerID="ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.832709 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.845216 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.856335 4765 scope.go:117] "RemoveContainer" containerID="55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.862685 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:10:14 crc kubenswrapper[4765]: E1003 09:10:14.863028 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="ceilometer-notification-agent" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.863041 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="ceilometer-notification-agent" Oct 03 09:10:14 crc kubenswrapper[4765]: E1003 09:10:14.863061 4765 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="ceilometer-central-agent" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.863069 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="ceilometer-central-agent" Oct 03 09:10:14 crc kubenswrapper[4765]: E1003 09:10:14.863084 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="sg-core" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.863091 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="sg-core" Oct 03 09:10:14 crc kubenswrapper[4765]: E1003 09:10:14.863104 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a850b634-7419-4983-ad82-8efd6dea5cb7" containerName="mariadb-account-delete" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.863110 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a850b634-7419-4983-ad82-8efd6dea5cb7" containerName="mariadb-account-delete" Oct 03 09:10:14 crc kubenswrapper[4765]: E1003 09:10:14.863123 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="proxy-httpd" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.863129 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="proxy-httpd" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.863332 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="ceilometer-notification-agent" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.863357 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="proxy-httpd" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.863373 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="ceilometer-central-agent" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.863384 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a850b634-7419-4983-ad82-8efd6dea5cb7" containerName="mariadb-account-delete" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.863394 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="61098995-6197-447a-b950-7ee9bfa2f643" containerName="sg-core" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.868779 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.870763 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-scripts" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.873556 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"cert-ceilometer-internal-svc" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.874046 4765 reflector.go:368] Caches populated for *v1.Secret from object-"watcher-kuttl-default"/"ceilometer-config-data" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.891709 4765 scope.go:117] "RemoveContainer" containerID="a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.897740 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.936705 4765 scope.go:117] "RemoveContainer" containerID="32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba" Oct 03 09:10:14 crc kubenswrapper[4765]: E1003 09:10:14.937298 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba\": container with ID starting with 32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba not found: ID does not exist" containerID="32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.937335 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba"} err="failed to get container status \"32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba\": rpc error: code = NotFound desc = could not find container \"32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba\": container with ID starting with 32d54cf682ef268c2bbd75695e7fe4b2ecf2191f5135a14665eb06b6468397ba not found: ID does not exist" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.937379 4765 scope.go:117] "RemoveContainer" containerID="ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d" Oct 03 09:10:14 crc kubenswrapper[4765]: E1003 09:10:14.938029 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d\": container with ID starting with ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d not found: ID does not exist" containerID="ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.938102 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d"} err="failed to get container status \"ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d\": rpc error: code = NotFound desc = could not find container \"ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d\": container with ID starting with ac3633d8155b97658e4c7f58ad6c3f06d426c6881f40074cc0dca9726a090e1d not found: ID does not exist" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.938135 4765 scope.go:117] "RemoveContainer" 
containerID="55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820" Oct 03 09:10:14 crc kubenswrapper[4765]: E1003 09:10:14.938475 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820\": container with ID starting with 55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820 not found: ID does not exist" containerID="55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.938519 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820"} err="failed to get container status \"55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820\": rpc error: code = NotFound desc = could not find container \"55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820\": container with ID starting with 55bd06de291f364b65c952dc29f64dc22d987b3daa89aea3183e79149d5b9820 not found: ID does not exist" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.938538 4765 scope.go:117] "RemoveContainer" containerID="a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8" Oct 03 09:10:14 crc kubenswrapper[4765]: E1003 09:10:14.939010 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8\": container with ID starting with a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8 not found: ID does not exist" containerID="a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8" Oct 03 09:10:14 crc kubenswrapper[4765]: I1003 09:10:14.939033 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8"} err="failed to get container status \"a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8\": rpc error: code = NotFound desc = could not find container \"a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8\": container with ID starting with a5d2feaddb249676c69586e35a62c1d79368a5730ccaed6998f70948202282a8 not found: ID does not exist" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.003360 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-config-data\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.003413 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nchz2\" (UniqueName: \"kubernetes.io/projected/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-kube-api-access-nchz2\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.003444 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 
09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.003473 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-log-httpd\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.003487 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.003508 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.003533 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-run-httpd\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.003557 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-scripts\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.105046 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-run-httpd\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.105124 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-scripts\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.105207 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-config-data\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.105253 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nchz2\" (UniqueName: \"kubernetes.io/projected/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-kube-api-access-nchz2\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.105295 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.105340 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-log-httpd\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.105367 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.105396 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.106179 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-run-httpd\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.106399 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-log-httpd\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.110282 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.110431 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-scripts\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.110823 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-config-data\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.112908 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.119999 4765 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.125388 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nchz2\" (UniqueName: \"kubernetes.io/projected/7cb3090b-4f8a-4f9c-843d-05d324c92f0f-kube-api-access-nchz2\") pod \"ceilometer-0\" (UID: \"7cb3090b-4f8a-4f9c-843d-05d324c92f0f\") " pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.185521 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.618905 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["watcher-kuttl-default/ceilometer-0"] Oct 03 09:10:15 crc kubenswrapper[4765]: W1003 09:10:15.633263 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cb3090b_4f8a_4f9c_843d_05d324c92f0f.slice/crio-19103208ffffacf54f0657f951ce2b54ba11429f5e141eab3cb3a1058f00484e WatchSource:0}: Error finding container 19103208ffffacf54f0657f951ce2b54ba11429f5e141eab3cb3a1058f00484e: Status 404 returned error can't find the container with id 19103208ffffacf54f0657f951ce2b54ba11429f5e141eab3cb3a1058f00484e Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.746950 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d7q" event={"ID":"c8739f94-bfcd-4c01-8025-d39aaddacde0","Type":"ContainerStarted","Data":"dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885"} Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.748620 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7cb3090b-4f8a-4f9c-843d-05d324c92f0f","Type":"ContainerStarted","Data":"19103208ffffacf54f0657f951ce2b54ba11429f5e141eab3cb3a1058f00484e"} Oct 03 09:10:15 crc kubenswrapper[4765]: I1003 09:10:15.765354 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-72d7q" podStartSLOduration=3.088659226 podStartE2EDuration="5.76533556s" podCreationTimestamp="2025-10-03 09:10:10 +0000 UTC" firstStartedPulling="2025-10-03 09:10:12.705165203 +0000 UTC m=+1857.006659533" lastFinishedPulling="2025-10-03 09:10:15.381841537 +0000 UTC m=+1859.683335867" observedRunningTime="2025-10-03 09:10:15.763173114 +0000 UTC m=+1860.064667444" watchObservedRunningTime="2025-10-03 09:10:15.76533556 +0000 UTC m=+1860.066829890" Oct 03 09:10:16 crc kubenswrapper[4765]: I1003 09:10:16.328032 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61098995-6197-447a-b950-7ee9bfa2f643" path="/var/lib/kubelet/pods/61098995-6197-447a-b950-7ee9bfa2f643/volumes" Oct 03 09:10:16 crc kubenswrapper[4765]: I1003 09:10:16.762603 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7cb3090b-4f8a-4f9c-843d-05d324c92f0f","Type":"ContainerStarted","Data":"cc0f67fa8673eff64e0e5250f093833eb20b6d39dc892ab7d352c68a58998d67"} Oct 03 09:10:17 crc kubenswrapper[4765]: I1003 09:10:17.774448 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" 
event={"ID":"7cb3090b-4f8a-4f9c-843d-05d324c92f0f","Type":"ContainerStarted","Data":"b8efeab18b1a6cc128c5853d4df275d80282612272fe6509b956d6538eeae07e"} Oct 03 09:10:18 crc kubenswrapper[4765]: I1003 09:10:18.784177 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7cb3090b-4f8a-4f9c-843d-05d324c92f0f","Type":"ContainerStarted","Data":"b35048f3ece9aa67627d8f822b6c358a3da3b6c2e27331810639875f47f853ea"} Oct 03 09:10:19 crc kubenswrapper[4765]: I1003 09:10:19.548780 4765 scope.go:117] "RemoveContainer" containerID="8c4736edbd19119c5c09fed3accd8e17e5ee98ea1e12ca39fc3b59aeaeb12c20" Oct 03 09:10:19 crc kubenswrapper[4765]: I1003 09:10:19.649352 4765 scope.go:117] "RemoveContainer" containerID="ecb964afaf2f1e274b0b620a5cab69ea802ecaad5e425655dc7383b31b2525c0" Oct 03 09:10:19 crc kubenswrapper[4765]: I1003 09:10:19.806849 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="watcher-kuttl-default/ceilometer-0" event={"ID":"7cb3090b-4f8a-4f9c-843d-05d324c92f0f","Type":"ContainerStarted","Data":"38913a6a8d17c9393ff90ba78c37d357bf87caab73ea4252109aed67e361e7d1"} Oct 03 09:10:19 crc kubenswrapper[4765]: I1003 09:10:19.807165 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:19 crc kubenswrapper[4765]: I1003 09:10:19.835619 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="watcher-kuttl-default/ceilometer-0" podStartSLOduration=1.9283623589999999 podStartE2EDuration="5.835600389s" podCreationTimestamp="2025-10-03 09:10:14 +0000 UTC" firstStartedPulling="2025-10-03 09:10:15.636507494 +0000 UTC m=+1859.938001824" lastFinishedPulling="2025-10-03 09:10:19.543745364 +0000 UTC m=+1863.845239854" observedRunningTime="2025-10-03 09:10:19.829121341 +0000 UTC m=+1864.130615671" watchObservedRunningTime="2025-10-03 09:10:19.835600389 +0000 UTC m=+1864.137094719" Oct 03 09:10:21 crc kubenswrapper[4765]: I1003 09:10:21.294569 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:21 crc kubenswrapper[4765]: I1003 09:10:21.295751 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:21 crc kubenswrapper[4765]: I1003 09:10:21.342727 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:21 crc kubenswrapper[4765]: I1003 09:10:21.874474 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:24 crc kubenswrapper[4765]: I1003 09:10:24.951712 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-72d7q"] Oct 03 09:10:24 crc kubenswrapper[4765]: I1003 09:10:24.952080 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-72d7q" podUID="c8739f94-bfcd-4c01-8025-d39aaddacde0" containerName="registry-server" containerID="cri-o://dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885" gracePeriod=2 Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.410424 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.468671 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rg5\" (UniqueName: \"kubernetes.io/projected/c8739f94-bfcd-4c01-8025-d39aaddacde0-kube-api-access-w9rg5\") pod \"c8739f94-bfcd-4c01-8025-d39aaddacde0\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.468771 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-catalog-content\") pod \"c8739f94-bfcd-4c01-8025-d39aaddacde0\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.468891 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-utilities\") pod \"c8739f94-bfcd-4c01-8025-d39aaddacde0\" (UID: \"c8739f94-bfcd-4c01-8025-d39aaddacde0\") " Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.469739 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-utilities" (OuterVolumeSpecName: "utilities") pod "c8739f94-bfcd-4c01-8025-d39aaddacde0" (UID: "c8739f94-bfcd-4c01-8025-d39aaddacde0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.477115 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8739f94-bfcd-4c01-8025-d39aaddacde0-kube-api-access-w9rg5" (OuterVolumeSpecName: "kube-api-access-w9rg5") pod "c8739f94-bfcd-4c01-8025-d39aaddacde0" (UID: "c8739f94-bfcd-4c01-8025-d39aaddacde0"). InnerVolumeSpecName "kube-api-access-w9rg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.519886 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8739f94-bfcd-4c01-8025-d39aaddacde0" (UID: "c8739f94-bfcd-4c01-8025-d39aaddacde0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.570729 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.570782 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rg5\" (UniqueName: \"kubernetes.io/projected/c8739f94-bfcd-4c01-8025-d39aaddacde0-kube-api-access-w9rg5\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.570796 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8739f94-bfcd-4c01-8025-d39aaddacde0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.855210 4765 generic.go:334] "Generic (PLEG): container finished" podID="c8739f94-bfcd-4c01-8025-d39aaddacde0" containerID="dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885" exitCode=0 Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.855259 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-72d7q" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.855276 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d7q" event={"ID":"c8739f94-bfcd-4c01-8025-d39aaddacde0","Type":"ContainerDied","Data":"dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885"} Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.855632 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72d7q" event={"ID":"c8739f94-bfcd-4c01-8025-d39aaddacde0","Type":"ContainerDied","Data":"fc46318fd066ed07a06cd1fc1ce0b325cfb2ed08c5042ce5fad976d4b160d1e4"} Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.855671 4765 scope.go:117] "RemoveContainer" containerID="dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.874590 4765 scope.go:117] "RemoveContainer" containerID="25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.890791 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-72d7q"] Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.894584 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-72d7q"] Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.913915 4765 scope.go:117] "RemoveContainer" containerID="965537b1c402acc3310196eba3f8fa4b5c495fa54852550bdfd467b1f9073860" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.932346 4765 scope.go:117] "RemoveContainer" containerID="dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885" Oct 03 09:10:25 crc kubenswrapper[4765]: E1003 09:10:25.932907 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885\": container with ID starting with dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885 not found: ID does not exist" containerID="dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.932937 
4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885"} err="failed to get container status \"dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885\": rpc error: code = NotFound desc = could not find container \"dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885\": container with ID starting with dd277bea0391faadd85dbd8b45d073d358ab7d12db9bee640f13a20004a36885 not found: ID does not exist" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.932957 4765 scope.go:117] "RemoveContainer" containerID="25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b" Oct 03 09:10:25 crc kubenswrapper[4765]: E1003 09:10:25.933223 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b\": container with ID starting with 25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b not found: ID does not exist" containerID="25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.933253 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b"} err="failed to get container status \"25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b\": rpc error: code = NotFound desc = could not find container \"25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b\": container with ID starting with 25cc6ec07fd12c67f593481c3808cbede35b35a3c3214ef388b70c6acf0fc21b not found: ID does not exist" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.933271 4765 scope.go:117] "RemoveContainer" containerID="965537b1c402acc3310196eba3f8fa4b5c495fa54852550bdfd467b1f9073860" Oct 03 09:10:25 crc kubenswrapper[4765]: E1003 09:10:25.933587 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965537b1c402acc3310196eba3f8fa4b5c495fa54852550bdfd467b1f9073860\": container with ID starting with 965537b1c402acc3310196eba3f8fa4b5c495fa54852550bdfd467b1f9073860 not found: ID does not exist" containerID="965537b1c402acc3310196eba3f8fa4b5c495fa54852550bdfd467b1f9073860" Oct 03 09:10:25 crc kubenswrapper[4765]: I1003 09:10:25.933605 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965537b1c402acc3310196eba3f8fa4b5c495fa54852550bdfd467b1f9073860"} err="failed to get container status \"965537b1c402acc3310196eba3f8fa4b5c495fa54852550bdfd467b1f9073860\": rpc error: code = NotFound desc = could not find container \"965537b1c402acc3310196eba3f8fa4b5c495fa54852550bdfd467b1f9073860\": container with ID starting with 965537b1c402acc3310196eba3f8fa4b5c495fa54852550bdfd467b1f9073860 not found: ID does not exist" Oct 03 09:10:26 crc kubenswrapper[4765]: I1003 09:10:26.317410 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8739f94-bfcd-4c01-8025-d39aaddacde0" path="/var/lib/kubelet/pods/c8739f94-bfcd-4c01-8025-d39aaddacde0/volumes" Oct 03 09:10:45 crc kubenswrapper[4765]: I1003 09:10:45.201269 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="watcher-kuttl-default/ceilometer-0" Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.792690 4765 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-gkwlq/must-gather-k88dn"] Oct 03 09:10:50 crc kubenswrapper[4765]: E1003 09:10:50.794856 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8739f94-bfcd-4c01-8025-d39aaddacde0" containerName="registry-server" Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.794976 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8739f94-bfcd-4c01-8025-d39aaddacde0" containerName="registry-server" Oct 03 09:10:50 crc kubenswrapper[4765]: E1003 09:10:50.795086 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8739f94-bfcd-4c01-8025-d39aaddacde0" containerName="extract-content" Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.795169 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8739f94-bfcd-4c01-8025-d39aaddacde0" containerName="extract-content" Oct 03 09:10:50 crc kubenswrapper[4765]: E1003 09:10:50.795282 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8739f94-bfcd-4c01-8025-d39aaddacde0" containerName="extract-utilities" Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.795370 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8739f94-bfcd-4c01-8025-d39aaddacde0" containerName="extract-utilities" Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.795810 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8739f94-bfcd-4c01-8025-d39aaddacde0" containerName="registry-server" Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.797014 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gkwlq/must-gather-k88dn" Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.798995 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gkwlq"/"kube-root-ca.crt" Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.799861 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-gkwlq"/"openshift-service-ca.crt" Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.803096 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-gkwlq"/"default-dockercfg-srxdq" Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.804342 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gkwlq/must-gather-k88dn"] Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.898115 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0bddda19-bd72-467b-956e-72a43398545f-must-gather-output\") pod \"must-gather-k88dn\" (UID: \"0bddda19-bd72-467b-956e-72a43398545f\") " pod="openshift-must-gather-gkwlq/must-gather-k88dn" Oct 03 09:10:50 crc kubenswrapper[4765]: I1003 09:10:50.898162 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsc69\" (UniqueName: \"kubernetes.io/projected/0bddda19-bd72-467b-956e-72a43398545f-kube-api-access-qsc69\") pod \"must-gather-k88dn\" (UID: \"0bddda19-bd72-467b-956e-72a43398545f\") " pod="openshift-must-gather-gkwlq/must-gather-k88dn" Oct 03 09:10:51 crc kubenswrapper[4765]: I1003 09:10:50.999923 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0bddda19-bd72-467b-956e-72a43398545f-must-gather-output\") pod \"must-gather-k88dn\" (UID: \"0bddda19-bd72-467b-956e-72a43398545f\") " 
pod="openshift-must-gather-gkwlq/must-gather-k88dn" Oct 03 09:10:51 crc kubenswrapper[4765]: I1003 09:10:50.999973 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsc69\" (UniqueName: \"kubernetes.io/projected/0bddda19-bd72-467b-956e-72a43398545f-kube-api-access-qsc69\") pod \"must-gather-k88dn\" (UID: \"0bddda19-bd72-467b-956e-72a43398545f\") " pod="openshift-must-gather-gkwlq/must-gather-k88dn" Oct 03 09:10:51 crc kubenswrapper[4765]: I1003 09:10:51.000466 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0bddda19-bd72-467b-956e-72a43398545f-must-gather-output\") pod \"must-gather-k88dn\" (UID: \"0bddda19-bd72-467b-956e-72a43398545f\") " pod="openshift-must-gather-gkwlq/must-gather-k88dn" Oct 03 09:10:51 crc kubenswrapper[4765]: I1003 09:10:51.020771 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsc69\" (UniqueName: \"kubernetes.io/projected/0bddda19-bd72-467b-956e-72a43398545f-kube-api-access-qsc69\") pod \"must-gather-k88dn\" (UID: \"0bddda19-bd72-467b-956e-72a43398545f\") " pod="openshift-must-gather-gkwlq/must-gather-k88dn" Oct 03 09:10:51 crc kubenswrapper[4765]: I1003 09:10:51.114360 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gkwlq/must-gather-k88dn" Oct 03 09:10:51 crc kubenswrapper[4765]: I1003 09:10:51.551435 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gkwlq/must-gather-k88dn"] Oct 03 09:10:51 crc kubenswrapper[4765]: I1003 09:10:51.561393 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:10:52 crc kubenswrapper[4765]: I1003 09:10:52.078959 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gkwlq/must-gather-k88dn" event={"ID":"0bddda19-bd72-467b-956e-72a43398545f","Type":"ContainerStarted","Data":"840bbd9a637223c759ca2d2c5119d05300d7ead235c4b5c387a19e5fcff15a86"} Oct 03 09:10:56 crc kubenswrapper[4765]: I1003 09:10:56.128938 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gkwlq/must-gather-k88dn" event={"ID":"0bddda19-bd72-467b-956e-72a43398545f","Type":"ContainerStarted","Data":"ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de"} Oct 03 09:10:57 crc kubenswrapper[4765]: I1003 09:10:57.138534 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gkwlq/must-gather-k88dn" event={"ID":"0bddda19-bd72-467b-956e-72a43398545f","Type":"ContainerStarted","Data":"841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91"} Oct 03 09:10:57 crc kubenswrapper[4765]: I1003 09:10:57.161163 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gkwlq/must-gather-k88dn" podStartSLOduration=2.947285428 podStartE2EDuration="7.161138657s" podCreationTimestamp="2025-10-03 09:10:50 +0000 UTC" firstStartedPulling="2025-10-03 09:10:51.561150216 +0000 UTC m=+1895.862644546" lastFinishedPulling="2025-10-03 09:10:55.775003445 +0000 UTC m=+1900.076497775" observedRunningTime="2025-10-03 09:10:57.154887575 +0000 UTC m=+1901.456381905" watchObservedRunningTime="2025-10-03 09:10:57.161138657 +0000 UTC m=+1901.462633007" Oct 03 09:11:19 crc kubenswrapper[4765]: I1003 09:11:19.803821 4765 scope.go:117] "RemoveContainer" containerID="c52e60ba8087540c720ae0d205856f62a2466d9553bd5bb0fed51601b94417a2" Oct 03 09:11:19 crc 
kubenswrapper[4765]: I1003 09:11:19.829066 4765 scope.go:117] "RemoveContainer" containerID="330330f7181b4c919e902b096ed32d564070fa45840494fff43596d7ae822baf" Oct 03 09:11:19 crc kubenswrapper[4765]: I1003 09:11:19.867017 4765 scope.go:117] "RemoveContainer" containerID="fce137e1c1c792c1ed9d0bebbcd3b1a3cfcd74dd023097e25aa5e9a1305f2d60" Oct 03 09:11:52 crc kubenswrapper[4765]: I1003 09:11:52.758737 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf_81cd31aa-88b3-4604-9e6c-6360936a50ae/util/0.log" Oct 03 09:11:52 crc kubenswrapper[4765]: I1003 09:11:52.963159 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf_81cd31aa-88b3-4604-9e6c-6360936a50ae/pull/0.log" Oct 03 09:11:52 crc kubenswrapper[4765]: I1003 09:11:52.963334 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf_81cd31aa-88b3-4604-9e6c-6360936a50ae/pull/0.log" Oct 03 09:11:52 crc kubenswrapper[4765]: I1003 09:11:52.978565 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf_81cd31aa-88b3-4604-9e6c-6360936a50ae/util/0.log" Oct 03 09:11:53 crc kubenswrapper[4765]: I1003 09:11:53.185992 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf_81cd31aa-88b3-4604-9e6c-6360936a50ae/util/0.log" Oct 03 09:11:53 crc kubenswrapper[4765]: I1003 09:11:53.219433 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf_81cd31aa-88b3-4604-9e6c-6360936a50ae/pull/0.log" Oct 03 09:11:53 crc kubenswrapper[4765]: I1003 09:11:53.225776 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b3029txlf_81cd31aa-88b3-4604-9e6c-6360936a50ae/extract/0.log" Oct 03 09:11:53 crc kubenswrapper[4765]: I1003 09:11:53.451068 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-z24cr_a9c35b93-23e9-43d5-9532-f1f0d51e8ae8/manager/0.log" Oct 03 09:11:53 crc kubenswrapper[4765]: I1003 09:11:53.484618 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-kxsxl_4fd5d3b8-ca79-48d0-9854-5bb9bc04eae4/kube-rbac-proxy/0.log" Oct 03 09:11:53 crc kubenswrapper[4765]: I1003 09:11:53.496563 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-z24cr_a9c35b93-23e9-43d5-9532-f1f0d51e8ae8/kube-rbac-proxy/0.log" Oct 03 09:11:53 crc kubenswrapper[4765]: I1003 09:11:53.930371 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-kxsxl_4fd5d3b8-ca79-48d0-9854-5bb9bc04eae4/manager/0.log" Oct 03 09:11:53 crc kubenswrapper[4765]: I1003 09:11:53.955147 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-c9sw7_11cef8fb-3e83-485c-8651-6fbf983c682a/kube-rbac-proxy/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.002958 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-c9sw7_11cef8fb-3e83-485c-8651-6fbf983c682a/manager/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.152149 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6_fa66b2dd-bf1b-43db-b43a-e4adae1d77ce/util/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.360271 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6_fa66b2dd-bf1b-43db-b43a-e4adae1d77ce/pull/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.361582 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6_fa66b2dd-bf1b-43db-b43a-e4adae1d77ce/util/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.440409 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6_fa66b2dd-bf1b-43db-b43a-e4adae1d77ce/pull/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.578455 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6_fa66b2dd-bf1b-43db-b43a-e4adae1d77ce/util/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.597045 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6_fa66b2dd-bf1b-43db-b43a-e4adae1d77ce/pull/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.611195 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e202c84312f326eb61c25220bdbc5ba9f724d2b420482503a1fb00e10cqqrl6_fa66b2dd-bf1b-43db-b43a-e4adae1d77ce/extract/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.789965 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-z5mzn_341b089a-f32e-4c46-a533-dbdb7f7b836f/kube-rbac-proxy/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.833818 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-z5mzn_341b089a-f32e-4c46-a533-dbdb7f7b836f/manager/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.902310 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-6t6dx_c5aa8d54-3d82-404c-a201-d6ca76dc8a8e/kube-rbac-proxy/0.log" Oct 03 09:11:54 crc kubenswrapper[4765]: I1003 09:11:54.987981 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-6t6dx_c5aa8d54-3d82-404c-a201-d6ca76dc8a8e/manager/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 09:11:55.056534 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-tjk7g_24929c35-6b01-4f35-9d3c-7cbd372661a7/kube-rbac-proxy/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 09:11:55.102108 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-tjk7g_24929c35-6b01-4f35-9d3c-7cbd372661a7/manager/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 
09:11:55.254851 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-k5nkl_24d5db65-49a9-45d6-949a-d8310646b691/kube-rbac-proxy/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 09:11:55.325838 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-k5nkl_24d5db65-49a9-45d6-949a-d8310646b691/manager/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 09:11:55.455592 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-kc4kr_991c16cf-3b2a-4f6c-b792-3848bffc434d/kube-rbac-proxy/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 09:11:55.485548 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-kc4kr_991c16cf-3b2a-4f6c-b792-3848bffc434d/manager/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 09:11:55.561737 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-88qh4_b609ed75-6fd6-4719-a9dd-eca3c0f7df03/kube-rbac-proxy/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 09:11:55.736718 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-88qh4_b609ed75-6fd6-4719-a9dd-eca3c0f7df03/manager/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 09:11:55.756483 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-rg2n6_65b40fd9-a427-4ab2-ab36-e4422081aa43/kube-rbac-proxy/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 09:11:55.842361 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-rg2n6_65b40fd9-a427-4ab2-ab36-e4422081aa43/manager/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 09:11:55.947795 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-8tlhl_2cda6800-dddc-4535-86bc-25d3f2d167b0/kube-rbac-proxy/0.log" Oct 03 09:11:55 crc kubenswrapper[4765]: I1003 09:11:55.985565 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-8tlhl_2cda6800-dddc-4535-86bc-25d3f2d167b0/manager/0.log" Oct 03 09:11:56 crc kubenswrapper[4765]: I1003 09:11:56.121136 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-867c4_21090664-3a4c-4c58-852d-d2779c7bf17d/kube-rbac-proxy/0.log" Oct 03 09:11:56 crc kubenswrapper[4765]: I1003 09:11:56.167747 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-867c4_21090664-3a4c-4c58-852d-d2779c7bf17d/manager/0.log" Oct 03 09:11:56 crc kubenswrapper[4765]: I1003 09:11:56.314165 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-7zq5j_c6bb21ee-995b-448f-8356-bd14d768d848/kube-rbac-proxy/0.log" Oct 03 09:11:56 crc kubenswrapper[4765]: I1003 09:11:56.357287 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-7zq5j_c6bb21ee-995b-448f-8356-bd14d768d848/manager/0.log" Oct 03 09:11:56 crc 
kubenswrapper[4765]: I1003 09:11:56.435920 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-zmn6f_a8759262-a00d-4609-9ca0-7bb0701064f0/kube-rbac-proxy/0.log" Oct 03 09:11:56 crc kubenswrapper[4765]: I1003 09:11:56.573551 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-zmn6f_a8759262-a00d-4609-9ca0-7bb0701064f0/manager/0.log" Oct 03 09:11:56 crc kubenswrapper[4765]: I1003 09:11:56.699279 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6_b52d06aa-3131-4887-b541-16ada075b2dd/manager/0.log" Oct 03 09:11:56 crc kubenswrapper[4765]: I1003 09:11:56.700881 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678bz9r6_b52d06aa-3131-4887-b541-16ada075b2dd/kube-rbac-proxy/0.log" Oct 03 09:11:56 crc kubenswrapper[4765]: I1003 09:11:56.798240 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fb94b767d-jsrj4_8075708f-0d72-4b98-b39e-a0ee77011c10/kube-rbac-proxy/0.log" Oct 03 09:11:56 crc kubenswrapper[4765]: I1003 09:11:56.932336 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-bkz6d_731a5006-db70-4aee-8324-98cd6d3b5cf8/registry-server/0.log" Oct 03 09:11:57 crc kubenswrapper[4765]: I1003 09:11:57.068279 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-qmgwv_a6dc4446-834f-45ac-8504-1ffc7be80df3/kube-rbac-proxy/0.log" Oct 03 09:11:57 crc kubenswrapper[4765]: I1003 09:11:57.208338 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-qmgwv_a6dc4446-834f-45ac-8504-1ffc7be80df3/manager/0.log" Oct 03 09:11:57 crc kubenswrapper[4765]: I1003 09:11:57.317734 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-gsgcs_98995a33-7241-42fe-a4c8-308da084b8db/kube-rbac-proxy/0.log" Oct 03 09:11:57 crc kubenswrapper[4765]: I1003 09:11:57.413304 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fb94b767d-jsrj4_8075708f-0d72-4b98-b39e-a0ee77011c10/manager/0.log" Oct 03 09:11:57 crc kubenswrapper[4765]: I1003 09:11:57.424019 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-gsgcs_98995a33-7241-42fe-a4c8-308da084b8db/manager/0.log" Oct 03 09:11:57 crc kubenswrapper[4765]: I1003 09:11:57.558133 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-bsdgf_ac8027c9-646b-4c9f-965e-639d6ad818dd/operator/0.log" Oct 03 09:11:57 crc kubenswrapper[4765]: I1003 09:11:57.676498 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-5rxfw_4a654679-7956-4885-841e-e1718fc57bef/manager/0.log" Oct 03 09:11:57 crc kubenswrapper[4765]: I1003 09:11:57.679063 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-5rxfw_4a654679-7956-4885-841e-e1718fc57bef/kube-rbac-proxy/0.log" Oct 03 09:11:58 crc kubenswrapper[4765]: I1003 09:11:58.010696 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-qjp99_b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7/kube-rbac-proxy/0.log" Oct 03 09:11:58 crc kubenswrapper[4765]: I1003 09:11:58.148002 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-vvxdl_f3a40e37-8073-4528-9379-9cea11f883ed/manager/0.log" Oct 03 09:11:58 crc kubenswrapper[4765]: I1003 09:11:58.195506 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-vvxdl_f3a40e37-8073-4528-9379-9cea11f883ed/kube-rbac-proxy/0.log" Oct 03 09:11:58 crc kubenswrapper[4765]: I1003 09:11:58.306429 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-qjp99_b0f3d3a4-5e2c-4f86-a6ab-65be50620ce7/manager/0.log" Oct 03 09:11:58 crc kubenswrapper[4765]: I1003 09:11:58.377165 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d56cb75ff-b6xc8_5b5423e1-1c3e-4cab-9cc7-112199b23e2e/kube-rbac-proxy/0.log" Oct 03 09:11:58 crc kubenswrapper[4765]: I1003 09:11:58.542516 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-index-v95cb_5a3da866-1f20-45f7-b2da-6d7beac4f7f4/registry-server/0.log" Oct 03 09:11:58 crc kubenswrapper[4765]: I1003 09:11:58.703072 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5d56cb75ff-b6xc8_5b5423e1-1c3e-4cab-9cc7-112199b23e2e/manager/0.log" Oct 03 09:12:15 crc kubenswrapper[4765]: I1003 09:12:15.266356 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rnsx7_8f8201b3-edba-4bac-9d31-08452195ff1f/control-plane-machine-set-operator/0.log" Oct 03 09:12:15 crc kubenswrapper[4765]: I1003 09:12:15.493047 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8mnt6_17c891c2-c5ff-4815-9f09-347204c5da1d/machine-api-operator/0.log" Oct 03 09:12:15 crc kubenswrapper[4765]: I1003 09:12:15.524229 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8mnt6_17c891c2-c5ff-4815-9f09-347204c5da1d/kube-rbac-proxy/0.log" Oct 03 09:12:19 crc kubenswrapper[4765]: I1003 09:12:19.988789 4765 scope.go:117] "RemoveContainer" containerID="5c944ee405adf975cb9f464c89114bc85779a402ed3be0fec81254716566e0a8" Oct 03 09:12:20 crc kubenswrapper[4765]: I1003 09:12:20.034048 4765 scope.go:117] "RemoveContainer" containerID="7f580f6c80011a6013386fcfee558fccb2c9b64b5f6460aa6fde8191d9650009" Oct 03 09:12:20 crc kubenswrapper[4765]: I1003 09:12:20.060572 4765 scope.go:117] "RemoveContainer" containerID="917858b779ed740f3db191723520a9ae27e54bcc078a6aebe580ef8a1baf447b" Oct 03 09:12:20 crc kubenswrapper[4765]: I1003 09:12:20.095565 4765 scope.go:117] "RemoveContainer" containerID="28c8d5a9fb8b49bec37ee98f35896a8cb55afcf28d1ba92c93967268c0f07240" Oct 03 09:12:28 crc kubenswrapper[4765]: I1003 09:12:28.046705 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-hlltz_27364373-79dc-418b-8ec0-5d0032a48040/cert-manager-controller/0.log" Oct 03 09:12:28 crc kubenswrapper[4765]: I1003 09:12:28.308128 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-9nl8k_fd40bba3-15c9-42d9-83f3-aeb014fd89eb/cert-manager-cainjector/0.log" Oct 03 09:12:28 crc kubenswrapper[4765]: I1003 09:12:28.309172 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-tpwbb_6428e1d2-b7e5-4046-bdd4-b1ec5c55e6cd/cert-manager-webhook/0.log" Oct 03 09:12:30 crc kubenswrapper[4765]: I1003 09:12:30.680020 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:12:30 crc kubenswrapper[4765]: I1003 09:12:30.680388 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:12:40 crc kubenswrapper[4765]: I1003 09:12:40.371507 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-bvxsq_6a5b68b1-49a0-4e3c-a08b-fc4de9903242/nmstate-console-plugin/0.log" Oct 03 09:12:40 crc kubenswrapper[4765]: I1003 09:12:40.535744 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bwwmp_572363ec-fce1-468d-998d-3ae0dac9c35a/nmstate-handler/0.log" Oct 03 09:12:40 crc kubenswrapper[4765]: I1003 09:12:40.597280 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-6gs2v_079a3543-9818-46c7-8500-d84424d4f411/kube-rbac-proxy/0.log" Oct 03 09:12:40 crc kubenswrapper[4765]: I1003 09:12:40.606179 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-6gs2v_079a3543-9818-46c7-8500-d84424d4f411/nmstate-metrics/0.log" Oct 03 09:12:40 crc kubenswrapper[4765]: I1003 09:12:40.806340 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-r5vr9_6a17b86e-1da6-447f-abd8-54873d510ee3/nmstate-operator/0.log" Oct 03 09:12:40 crc kubenswrapper[4765]: I1003 09:12:40.857059 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-m6bmz_e3620a8c-3a6c-4f66-8101-cd7f7b91f7d0/nmstate-webhook/0.log" Oct 03 09:12:55 crc kubenswrapper[4765]: I1003 09:12:55.416611 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-2cwp9_e1ff5baf-0400-42a9-8815-1a618208934e/kube-rbac-proxy/0.log" Oct 03 09:12:55 crc kubenswrapper[4765]: I1003 09:12:55.640058 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-2cwp9_e1ff5baf-0400-42a9-8815-1a618208934e/controller/0.log" Oct 03 09:12:55 crc kubenswrapper[4765]: I1003 09:12:55.720193 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-frr-files/0.log" Oct 03 09:12:55 crc kubenswrapper[4765]: I1003 09:12:55.948127 4765 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-metrics/0.log" Oct 03 09:12:55 crc kubenswrapper[4765]: I1003 09:12:55.977204 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-frr-files/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.015323 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-reloader/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.018051 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-reloader/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.173180 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-frr-files/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.234782 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-reloader/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.235681 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-metrics/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.319766 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-metrics/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.450239 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-frr-files/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.511089 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/controller/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.525066 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-metrics/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.525740 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/cp-reloader/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.700165 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/frr-metrics/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.716607 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/kube-rbac-proxy-frr/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.770324 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/kube-rbac-proxy/0.log" Oct 03 09:12:56 crc kubenswrapper[4765]: I1003 09:12:56.930717 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/reloader/0.log" Oct 03 09:12:57 crc kubenswrapper[4765]: I1003 09:12:57.036168 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-8gx25_a9886be2-cc56-4d96-b89e-55c1fc65774e/frr-k8s-webhook-server/0.log" Oct 03 09:12:57 crc kubenswrapper[4765]: I1003 09:12:57.265811 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-778d9f7978-ns677_f5247665-eb4f-47b9-9400-71a8a43d381c/manager/0.log" Oct 03 09:12:57 crc kubenswrapper[4765]: I1003 09:12:57.486210 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-77bffb8bb6-z6dx6_4884351b-d735-4f5a-904a-93df33b47d3f/webhook-server/0.log" Oct 03 09:12:57 crc kubenswrapper[4765]: I1003 09:12:57.667848 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-h9tnj_93299a5d-55e2-4554-9fe8-4432acc25332/kube-rbac-proxy/0.log" Oct 03 09:12:57 crc kubenswrapper[4765]: I1003 09:12:57.668156 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-sx8td_6a402791-ba68-4594-bb36-a6c491fdf723/frr/0.log" Oct 03 09:12:57 crc kubenswrapper[4765]: I1003 09:12:57.961208 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-h9tnj_93299a5d-55e2-4554-9fe8-4432acc25332/speaker/0.log" Oct 03 09:13:00 crc kubenswrapper[4765]: I1003 09:13:00.680200 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:13:00 crc kubenswrapper[4765]: I1003 09:13:00.680537 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.157229 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c46pz"] Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.159062 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.176421 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c46pz"] Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.285977 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-catalog-content\") pod \"redhat-operators-c46pz\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.286079 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-utilities\") pod \"redhat-operators-c46pz\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.286196 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p972v\" (UniqueName: \"kubernetes.io/projected/04492332-af2a-41b4-a716-2c926247b9b0-kube-api-access-p972v\") pod \"redhat-operators-c46pz\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.388329 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-catalog-content\") pod \"redhat-operators-c46pz\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.388449 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-utilities\") pod \"redhat-operators-c46pz\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.388473 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p972v\" (UniqueName: \"kubernetes.io/projected/04492332-af2a-41b4-a716-2c926247b9b0-kube-api-access-p972v\") pod \"redhat-operators-c46pz\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.389035 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-catalog-content\") pod \"redhat-operators-c46pz\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.389098 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-utilities\") pod \"redhat-operators-c46pz\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.407600 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p972v\" (UniqueName: \"kubernetes.io/projected/04492332-af2a-41b4-a716-2c926247b9b0-kube-api-access-p972v\") pod \"redhat-operators-c46pz\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:03 crc kubenswrapper[4765]: I1003 09:13:03.490492 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:04 crc kubenswrapper[4765]: I1003 09:13:04.024537 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c46pz"] Oct 03 09:13:04 crc kubenswrapper[4765]: I1003 09:13:04.122293 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c46pz" event={"ID":"04492332-af2a-41b4-a716-2c926247b9b0","Type":"ContainerStarted","Data":"2f06b8636c5dfac64afb94b039c3a4835b2261f93d1a4af2740c387a5f1b2c20"} Oct 03 09:13:05 crc kubenswrapper[4765]: I1003 09:13:05.131384 4765 generic.go:334] "Generic (PLEG): container finished" podID="04492332-af2a-41b4-a716-2c926247b9b0" containerID="8431957ab7e4cbe4a938bda1ffde6165b0bd81868158bf10ade2217ff1d5fa4d" exitCode=0 Oct 03 09:13:05 crc kubenswrapper[4765]: I1003 09:13:05.131448 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c46pz" event={"ID":"04492332-af2a-41b4-a716-2c926247b9b0","Type":"ContainerDied","Data":"8431957ab7e4cbe4a938bda1ffde6165b0bd81868158bf10ade2217ff1d5fa4d"} Oct 03 09:13:07 crc kubenswrapper[4765]: I1003 09:13:07.150033 4765 generic.go:334] "Generic (PLEG): container finished" podID="04492332-af2a-41b4-a716-2c926247b9b0" containerID="cba7263e52d1811294c4a2520e2604e023c4f7722e08b79fbeab73ec2bd01090" exitCode=0 Oct 03 09:13:07 crc kubenswrapper[4765]: I1003 09:13:07.150112 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c46pz" event={"ID":"04492332-af2a-41b4-a716-2c926247b9b0","Type":"ContainerDied","Data":"cba7263e52d1811294c4a2520e2604e023c4f7722e08b79fbeab73ec2bd01090"} Oct 03 09:13:08 crc kubenswrapper[4765]: I1003 09:13:08.161743 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c46pz" event={"ID":"04492332-af2a-41b4-a716-2c926247b9b0","Type":"ContainerStarted","Data":"cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c"} Oct 03 09:13:08 crc kubenswrapper[4765]: I1003 09:13:08.182525 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c46pz" podStartSLOduration=2.7030851350000002 podStartE2EDuration="5.182510064s" podCreationTimestamp="2025-10-03 09:13:03 +0000 UTC" firstStartedPulling="2025-10-03 09:13:05.132972323 +0000 UTC m=+2029.434466653" lastFinishedPulling="2025-10-03 09:13:07.612397252 +0000 UTC m=+2031.913891582" observedRunningTime="2025-10-03 09:13:08.182162995 +0000 UTC m=+2032.483657325" watchObservedRunningTime="2025-10-03 09:13:08.182510064 +0000 UTC m=+2032.484004394" Oct 03 09:13:13 crc kubenswrapper[4765]: I1003 09:13:13.491596 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:13 crc kubenswrapper[4765]: I1003 09:13:13.492356 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:13 crc kubenswrapper[4765]: I1003 09:13:13.544027 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:14 crc kubenswrapper[4765]: I1003 09:13:14.270715 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:17 crc kubenswrapper[4765]: I1003 09:13:17.749408 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c46pz"] Oct 03 09:13:17 crc kubenswrapper[4765]: I1003 09:13:17.750145 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c46pz" podUID="04492332-af2a-41b4-a716-2c926247b9b0" containerName="registry-server" containerID="cri-o://cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c" gracePeriod=2 Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.155871 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.225824 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p972v\" (UniqueName: \"kubernetes.io/projected/04492332-af2a-41b4-a716-2c926247b9b0-kube-api-access-p972v\") pod \"04492332-af2a-41b4-a716-2c926247b9b0\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.226266 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-catalog-content\") pod \"04492332-af2a-41b4-a716-2c926247b9b0\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.226308 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-utilities\") pod \"04492332-af2a-41b4-a716-2c926247b9b0\" (UID: \"04492332-af2a-41b4-a716-2c926247b9b0\") " Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.227107 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-utilities" (OuterVolumeSpecName: "utilities") pod "04492332-af2a-41b4-a716-2c926247b9b0" (UID: "04492332-af2a-41b4-a716-2c926247b9b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.227527 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.238345 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04492332-af2a-41b4-a716-2c926247b9b0-kube-api-access-p972v" (OuterVolumeSpecName: "kube-api-access-p972v") pod "04492332-af2a-41b4-a716-2c926247b9b0" (UID: "04492332-af2a-41b4-a716-2c926247b9b0"). InnerVolumeSpecName "kube-api-access-p972v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.254327 4765 generic.go:334] "Generic (PLEG): container finished" podID="04492332-af2a-41b4-a716-2c926247b9b0" containerID="cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c" exitCode=0 Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.254377 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c46pz" event={"ID":"04492332-af2a-41b4-a716-2c926247b9b0","Type":"ContainerDied","Data":"cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c"} Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.254406 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c46pz" event={"ID":"04492332-af2a-41b4-a716-2c926247b9b0","Type":"ContainerDied","Data":"2f06b8636c5dfac64afb94b039c3a4835b2261f93d1a4af2740c387a5f1b2c20"} Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.254426 4765 scope.go:117] "RemoveContainer" containerID="cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.254571 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c46pz" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.294140 4765 scope.go:117] "RemoveContainer" containerID="cba7263e52d1811294c4a2520e2604e023c4f7722e08b79fbeab73ec2bd01090" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.320078 4765 scope.go:117] "RemoveContainer" containerID="8431957ab7e4cbe4a938bda1ffde6165b0bd81868158bf10ade2217ff1d5fa4d" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.322148 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04492332-af2a-41b4-a716-2c926247b9b0" (UID: "04492332-af2a-41b4-a716-2c926247b9b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.329020 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p972v\" (UniqueName: \"kubernetes.io/projected/04492332-af2a-41b4-a716-2c926247b9b0-kube-api-access-p972v\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.329048 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04492332-af2a-41b4-a716-2c926247b9b0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.354676 4765 scope.go:117] "RemoveContainer" containerID="cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c" Oct 03 09:13:18 crc kubenswrapper[4765]: E1003 09:13:18.355141 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c\": container with ID starting with cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c not found: ID does not exist" containerID="cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.355192 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c"} err="failed to get container status \"cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c\": rpc error: code = NotFound desc = could not find container \"cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c\": container with ID starting with cb468fc9208063417da7d0c41b06ca748e79f30282c1438479e00dc2930f503c not found: ID does not exist" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.355268 4765 scope.go:117] "RemoveContainer" containerID="cba7263e52d1811294c4a2520e2604e023c4f7722e08b79fbeab73ec2bd01090" Oct 03 09:13:18 crc kubenswrapper[4765]: E1003 09:13:18.355525 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cba7263e52d1811294c4a2520e2604e023c4f7722e08b79fbeab73ec2bd01090\": container with ID starting with cba7263e52d1811294c4a2520e2604e023c4f7722e08b79fbeab73ec2bd01090 not found: ID does not exist" containerID="cba7263e52d1811294c4a2520e2604e023c4f7722e08b79fbeab73ec2bd01090" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.355552 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cba7263e52d1811294c4a2520e2604e023c4f7722e08b79fbeab73ec2bd01090"} err="failed to get container status \"cba7263e52d1811294c4a2520e2604e023c4f7722e08b79fbeab73ec2bd01090\": rpc error: code = NotFound desc = could not find container \"cba7263e52d1811294c4a2520e2604e023c4f7722e08b79fbeab73ec2bd01090\": container with ID starting with cba7263e52d1811294c4a2520e2604e023c4f7722e08b79fbeab73ec2bd01090 not found: ID does not exist" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.355569 4765 scope.go:117] "RemoveContainer" containerID="8431957ab7e4cbe4a938bda1ffde6165b0bd81868158bf10ade2217ff1d5fa4d" Oct 03 09:13:18 crc kubenswrapper[4765]: E1003 09:13:18.355774 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8431957ab7e4cbe4a938bda1ffde6165b0bd81868158bf10ade2217ff1d5fa4d\": container with ID starting with 
8431957ab7e4cbe4a938bda1ffde6165b0bd81868158bf10ade2217ff1d5fa4d not found: ID does not exist" containerID="8431957ab7e4cbe4a938bda1ffde6165b0bd81868158bf10ade2217ff1d5fa4d" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.355805 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8431957ab7e4cbe4a938bda1ffde6165b0bd81868158bf10ade2217ff1d5fa4d"} err="failed to get container status \"8431957ab7e4cbe4a938bda1ffde6165b0bd81868158bf10ade2217ff1d5fa4d\": rpc error: code = NotFound desc = could not find container \"8431957ab7e4cbe4a938bda1ffde6165b0bd81868158bf10ade2217ff1d5fa4d\": container with ID starting with 8431957ab7e4cbe4a938bda1ffde6165b0bd81868158bf10ade2217ff1d5fa4d not found: ID does not exist" Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.593598 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c46pz"] Oct 03 09:13:18 crc kubenswrapper[4765]: I1003 09:13:18.603414 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c46pz"] Oct 03 09:13:20 crc kubenswrapper[4765]: I1003 09:13:20.211910 4765 scope.go:117] "RemoveContainer" containerID="dfda99923840b38f8150060deb154ec2326ee60d46ebe1e406b98d760aea99c7" Oct 03 09:13:20 crc kubenswrapper[4765]: I1003 09:13:20.274812 4765 scope.go:117] "RemoveContainer" containerID="a8484ac98edaaf57e9fa0bbd9ed0a814437584819226b3f4a6fdb9b23d694adc" Oct 03 09:13:20 crc kubenswrapper[4765]: I1003 09:13:20.318234 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04492332-af2a-41b4-a716-2c926247b9b0" path="/var/lib/kubelet/pods/04492332-af2a-41b4-a716-2c926247b9b0/volumes" Oct 03 09:13:20 crc kubenswrapper[4765]: I1003 09:13:20.319575 4765 scope.go:117] "RemoveContainer" containerID="b897ac33fcbb2351a24542507c6113278751d033f25dd4c328e4a384f0ebe9e2" Oct 03 09:13:20 crc kubenswrapper[4765]: I1003 09:13:20.642090 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_c0fe5012-3b98-4ef6-954c-27b4e962a1cc/init-config-reloader/0.log" Oct 03 09:13:20 crc kubenswrapper[4765]: I1003 09:13:20.883688 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_c0fe5012-3b98-4ef6-954c-27b4e962a1cc/init-config-reloader/0.log" Oct 03 09:13:20 crc kubenswrapper[4765]: I1003 09:13:20.929015 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_c0fe5012-3b98-4ef6-954c-27b4e962a1cc/config-reloader/0.log" Oct 03 09:13:20 crc kubenswrapper[4765]: I1003 09:13:20.940519 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_alertmanager-metric-storage-0_c0fe5012-3b98-4ef6-954c-27b4e962a1cc/alertmanager/0.log" Oct 03 09:13:21 crc kubenswrapper[4765]: I1003 09:13:21.064458 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_7cb3090b-4f8a-4f9c-843d-05d324c92f0f/ceilometer-central-agent/0.log" Oct 03 09:13:21 crc kubenswrapper[4765]: I1003 09:13:21.120346 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_7cb3090b-4f8a-4f9c-843d-05d324c92f0f/ceilometer-notification-agent/0.log" Oct 03 09:13:21 crc kubenswrapper[4765]: I1003 09:13:21.157741 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_7cb3090b-4f8a-4f9c-843d-05d324c92f0f/proxy-httpd/0.log" Oct 03 
09:13:21 crc kubenswrapper[4765]: I1003 09:13:21.212777 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_ceilometer-0_7cb3090b-4f8a-4f9c-843d-05d324c92f0f/sg-core/0.log" Oct 03 09:13:21 crc kubenswrapper[4765]: I1003 09:13:21.386256 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_keystone-7bb6879856-dm6qk_0ab88eac-dab7-4528-9654-8d5b819ca4d6/keystone-api/0.log" Oct 03 09:13:21 crc kubenswrapper[4765]: I1003 09:13:21.424271 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_keystone-bootstrap-kbjbs_3645711c-7984-4502-aee7-98e45640eaa9/keystone-bootstrap/0.log" Oct 03 09:13:21 crc kubenswrapper[4765]: I1003 09:13:21.595222 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_keystone-cron-29324701-zwcd8_ad34be00-f9d6-41d7-9db0-decf8a030e53/keystone-cron/0.log" Oct 03 09:13:21 crc kubenswrapper[4765]: I1003 09:13:21.632864 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_kube-state-metrics-0_f2c99332-43c7-45cc-b2d3-83f8fe1ffc41/kube-state-metrics/0.log" Oct 03 09:13:21 crc kubenswrapper[4765]: I1003 09:13:21.896583 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_3f7d0aed-bfcf-4589-a75c-d94328cf7b7a/mysql-bootstrap/0.log" Oct 03 09:13:22 crc kubenswrapper[4765]: I1003 09:13:22.064914 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-kbjbs"] Oct 03 09:13:22 crc kubenswrapper[4765]: I1003 09:13:22.072266 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["watcher-kuttl-default/keystone-bootstrap-kbjbs"] Oct 03 09:13:22 crc kubenswrapper[4765]: I1003 09:13:22.318999 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3645711c-7984-4502-aee7-98e45640eaa9" path="/var/lib/kubelet/pods/3645711c-7984-4502-aee7-98e45640eaa9/volumes" Oct 03 09:13:22 crc kubenswrapper[4765]: I1003 09:13:22.409511 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_3f7d0aed-bfcf-4589-a75c-d94328cf7b7a/galera/0.log" Oct 03 09:13:22 crc kubenswrapper[4765]: I1003 09:13:22.533631 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstack-galera-0_3f7d0aed-bfcf-4589-a75c-d94328cf7b7a/mysql-bootstrap/0.log" Oct 03 09:13:22 crc kubenswrapper[4765]: I1003 09:13:22.777362 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_openstackclient_73ef00a9-8d50-49fb-84ae-669fff822e30/openstackclient/0.log" Oct 03 09:13:22 crc kubenswrapper[4765]: I1003 09:13:22.888052 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_81065dff-f372-4966-8f6b-751090a1f5b6/init-config-reloader/0.log" Oct 03 09:13:23 crc kubenswrapper[4765]: I1003 09:13:23.069138 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_81065dff-f372-4966-8f6b-751090a1f5b6/config-reloader/0.log" Oct 03 09:13:23 crc kubenswrapper[4765]: I1003 09:13:23.172825 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_81065dff-f372-4966-8f6b-751090a1f5b6/init-config-reloader/0.log" Oct 03 09:13:23 crc kubenswrapper[4765]: I1003 09:13:23.188851 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_81065dff-f372-4966-8f6b-751090a1f5b6/prometheus/0.log" Oct 03 09:13:23 crc kubenswrapper[4765]: I1003 09:13:23.301772 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_prometheus-metric-storage-0_81065dff-f372-4966-8f6b-751090a1f5b6/thanos-sidecar/0.log" Oct 03 09:13:23 crc kubenswrapper[4765]: I1003 09:13:23.541381 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_822ab948-07b5-4946-aeb3-d6cd9e4f6752/setup-container/0.log" Oct 03 09:13:23 crc kubenswrapper[4765]: I1003 09:13:23.733104 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_822ab948-07b5-4946-aeb3-d6cd9e4f6752/setup-container/0.log" Oct 03 09:13:23 crc kubenswrapper[4765]: I1003 09:13:23.779801 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-notifications-server-0_822ab948-07b5-4946-aeb3-d6cd9e4f6752/rabbitmq/0.log" Oct 03 09:13:23 crc kubenswrapper[4765]: I1003 09:13:23.929495 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_ee23f3ed-67bb-44ab-93fe-8251f7768941/setup-container/0.log" Oct 03 09:13:24 crc kubenswrapper[4765]: I1003 09:13:24.171467 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_ee23f3ed-67bb-44ab-93fe-8251f7768941/setup-container/0.log" Oct 03 09:13:24 crc kubenswrapper[4765]: I1003 09:13:24.254965 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_rabbitmq-server-0_ee23f3ed-67bb-44ab-93fe-8251f7768941/rabbitmq/0.log" Oct 03 09:13:30 crc kubenswrapper[4765]: I1003 09:13:30.680044 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:13:30 crc kubenswrapper[4765]: I1003 09:13:30.680428 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:13:30 crc kubenswrapper[4765]: I1003 09:13:30.680478 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 09:13:30 crc kubenswrapper[4765]: I1003 09:13:30.681175 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1978eaa8d4c867a88cdf1bd67d12cbe56c8e728282f53b9cdc636787901cab02"} pod="openshift-machine-config-operator/machine-config-daemon-j8mss" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:13:30 crc kubenswrapper[4765]: I1003 09:13:30.681222 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" containerID="cri-o://1978eaa8d4c867a88cdf1bd67d12cbe56c8e728282f53b9cdc636787901cab02" gracePeriod=600 Oct 03 09:13:31 crc 
kubenswrapper[4765]: I1003 09:13:31.391342 4765 generic.go:334] "Generic (PLEG): container finished" podID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerID="1978eaa8d4c867a88cdf1bd67d12cbe56c8e728282f53b9cdc636787901cab02" exitCode=0 Oct 03 09:13:31 crc kubenswrapper[4765]: I1003 09:13:31.391667 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerDied","Data":"1978eaa8d4c867a88cdf1bd67d12cbe56c8e728282f53b9cdc636787901cab02"} Oct 03 09:13:31 crc kubenswrapper[4765]: I1003 09:13:31.391706 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerStarted","Data":"42bfa14a2e89b5cffa309745efef82767cc344ded4fb8e7eea4024e396e441ec"} Oct 03 09:13:31 crc kubenswrapper[4765]: I1003 09:13:31.391732 4765 scope.go:117] "RemoveContainer" containerID="dd918556e4256b95f1ffce5dba4f8a301b33441a569fc5bbea88da3f09eb9800" Oct 03 09:13:31 crc kubenswrapper[4765]: I1003 09:13:31.590786 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/watcher-kuttl-default_memcached-0_3023d790-ea2b-44fc-9255-338f822b368c/memcached/0.log" Oct 03 09:13:41 crc kubenswrapper[4765]: I1003 09:13:41.090775 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6_16c6b0c1-ad97-4661-81c4-0ae36496ca1e/util/0.log" Oct 03 09:13:41 crc kubenswrapper[4765]: I1003 09:13:41.292863 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6_16c6b0c1-ad97-4661-81c4-0ae36496ca1e/util/0.log" Oct 03 09:13:41 crc kubenswrapper[4765]: I1003 09:13:41.318200 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6_16c6b0c1-ad97-4661-81c4-0ae36496ca1e/pull/0.log" Oct 03 09:13:41 crc kubenswrapper[4765]: I1003 09:13:41.333919 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6_16c6b0c1-ad97-4661-81c4-0ae36496ca1e/pull/0.log" Oct 03 09:13:41 crc kubenswrapper[4765]: I1003 09:13:41.540606 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6_16c6b0c1-ad97-4661-81c4-0ae36496ca1e/pull/0.log" Oct 03 09:13:41 crc kubenswrapper[4765]: I1003 09:13:41.548547 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6_16c6b0c1-ad97-4661-81c4-0ae36496ca1e/extract/0.log" Oct 03 09:13:41 crc kubenswrapper[4765]: I1003 09:13:41.583447 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69pfsz6_16c6b0c1-ad97-4661-81c4-0ae36496ca1e/util/0.log" Oct 03 09:13:41 crc kubenswrapper[4765]: I1003 09:13:41.742097 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj_aca861ef-e249-4395-8760-c2b556b47ae7/util/0.log" Oct 03 09:13:41 crc kubenswrapper[4765]: I1003 09:13:41.903015 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj_aca861ef-e249-4395-8760-c2b556b47ae7/util/0.log" Oct 03 09:13:41 crc kubenswrapper[4765]: I1003 09:13:41.939390 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj_aca861ef-e249-4395-8760-c2b556b47ae7/pull/0.log" Oct 03 09:13:41 crc kubenswrapper[4765]: I1003 09:13:41.958738 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj_aca861ef-e249-4395-8760-c2b556b47ae7/pull/0.log" Oct 03 09:13:42 crc kubenswrapper[4765]: I1003 09:13:42.132493 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj_aca861ef-e249-4395-8760-c2b556b47ae7/extract/0.log" Oct 03 09:13:42 crc kubenswrapper[4765]: I1003 09:13:42.163723 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj_aca861ef-e249-4395-8760-c2b556b47ae7/pull/0.log" Oct 03 09:13:42 crc kubenswrapper[4765]: I1003 09:13:42.192796 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2r76hj_aca861ef-e249-4395-8760-c2b556b47ae7/util/0.log" Oct 03 09:13:42 crc kubenswrapper[4765]: I1003 09:13:42.357110 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl_4522df7f-bbdd-4884-a115-d33fec3bb365/util/0.log" Oct 03 09:13:42 crc kubenswrapper[4765]: I1003 09:13:42.607800 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl_4522df7f-bbdd-4884-a115-d33fec3bb365/pull/0.log" Oct 03 09:13:42 crc kubenswrapper[4765]: I1003 09:13:42.633029 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl_4522df7f-bbdd-4884-a115-d33fec3bb365/pull/0.log" Oct 03 09:13:42 crc kubenswrapper[4765]: I1003 09:13:42.633067 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl_4522df7f-bbdd-4884-a115-d33fec3bb365/util/0.log" Oct 03 09:13:42 crc kubenswrapper[4765]: I1003 09:13:42.828941 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl_4522df7f-bbdd-4884-a115-d33fec3bb365/pull/0.log" Oct 03 09:13:42 crc kubenswrapper[4765]: I1003 09:13:42.834983 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl_4522df7f-bbdd-4884-a115-d33fec3bb365/extract/0.log" Oct 03 09:13:42 crc kubenswrapper[4765]: I1003 09:13:42.851411 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dzxfhl_4522df7f-bbdd-4884-a115-d33fec3bb365/util/0.log" Oct 03 09:13:43 crc kubenswrapper[4765]: I1003 09:13:43.013308 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vw7jl_b1baebbf-ae2b-4d7c-a366-3c4ecb7741db/extract-utilities/0.log" Oct 03 
09:13:43 crc kubenswrapper[4765]: I1003 09:13:43.242821 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vw7jl_b1baebbf-ae2b-4d7c-a366-3c4ecb7741db/extract-content/0.log" Oct 03 09:13:43 crc kubenswrapper[4765]: I1003 09:13:43.266124 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vw7jl_b1baebbf-ae2b-4d7c-a366-3c4ecb7741db/extract-utilities/0.log" Oct 03 09:13:43 crc kubenswrapper[4765]: I1003 09:13:43.289433 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vw7jl_b1baebbf-ae2b-4d7c-a366-3c4ecb7741db/extract-content/0.log" Oct 03 09:13:43 crc kubenswrapper[4765]: I1003 09:13:43.447407 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vw7jl_b1baebbf-ae2b-4d7c-a366-3c4ecb7741db/extract-utilities/0.log" Oct 03 09:13:43 crc kubenswrapper[4765]: I1003 09:13:43.453520 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vw7jl_b1baebbf-ae2b-4d7c-a366-3c4ecb7741db/extract-content/0.log" Oct 03 09:13:43 crc kubenswrapper[4765]: I1003 09:13:43.902179 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vw7jl_b1baebbf-ae2b-4d7c-a366-3c4ecb7741db/registry-server/0.log" Oct 03 09:13:43 crc kubenswrapper[4765]: I1003 09:13:43.979021 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kcpdx_5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9/extract-utilities/0.log" Oct 03 09:13:44 crc kubenswrapper[4765]: I1003 09:13:44.131611 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kcpdx_5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9/extract-utilities/0.log" Oct 03 09:13:44 crc kubenswrapper[4765]: I1003 09:13:44.132347 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kcpdx_5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9/extract-content/0.log" Oct 03 09:13:44 crc kubenswrapper[4765]: I1003 09:13:44.160107 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kcpdx_5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9/extract-content/0.log" Oct 03 09:13:44 crc kubenswrapper[4765]: I1003 09:13:44.292798 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kcpdx_5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9/extract-content/0.log" Oct 03 09:13:44 crc kubenswrapper[4765]: I1003 09:13:44.307122 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kcpdx_5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9/extract-utilities/0.log" Oct 03 09:13:44 crc kubenswrapper[4765]: I1003 09:13:44.521128 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_e758dad4-a664-40cc-b2e1-e7e43c757276/util/0.log" Oct 03 09:13:44 crc kubenswrapper[4765]: I1003 09:13:44.746715 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_e758dad4-a664-40cc-b2e1-e7e43c757276/pull/0.log" Oct 03 09:13:44 crc kubenswrapper[4765]: I1003 09:13:44.765380 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_e758dad4-a664-40cc-b2e1-e7e43c757276/pull/0.log" Oct 03 09:13:44 crc kubenswrapper[4765]: I1003 09:13:44.773566 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kcpdx_5ff32a58-625a-44cf-bd6e-7a2f7b3f06f9/registry-server/0.log" Oct 03 09:13:44 crc kubenswrapper[4765]: I1003 09:13:44.786999 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_e758dad4-a664-40cc-b2e1-e7e43c757276/util/0.log" Oct 03 09:13:44 crc kubenswrapper[4765]: I1003 09:13:44.980094 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_e758dad4-a664-40cc-b2e1-e7e43c757276/util/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.008414 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_e758dad4-a664-40cc-b2e1-e7e43c757276/pull/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.011623 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fpzd6_a266cf6c-8014-4ce5-a57a-9851e4f971c5/marketplace-operator/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.046527 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ck649t_e758dad4-a664-40cc-b2e1-e7e43c757276/extract/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.227528 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zg6xn_66ed5f9c-a15e-45b4-b79f-e574371609c8/extract-utilities/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.429244 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zg6xn_66ed5f9c-a15e-45b4-b79f-e574371609c8/extract-content/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.437309 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zg6xn_66ed5f9c-a15e-45b4-b79f-e574371609c8/extract-utilities/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.458602 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zg6xn_66ed5f9c-a15e-45b4-b79f-e574371609c8/extract-content/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.591971 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zg6xn_66ed5f9c-a15e-45b4-b79f-e574371609c8/extract-utilities/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.598902 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zg6xn_66ed5f9c-a15e-45b4-b79f-e574371609c8/extract-content/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.707330 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zg6xn_66ed5f9c-a15e-45b4-b79f-e574371609c8/registry-server/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.749497 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-bgnsr_085812e0-4616-4b7e-a4b4-1301aa194042/extract-utilities/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.869264 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnsr_085812e0-4616-4b7e-a4b4-1301aa194042/extract-content/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.884306 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnsr_085812e0-4616-4b7e-a4b4-1301aa194042/extract-utilities/0.log" Oct 03 09:13:45 crc kubenswrapper[4765]: I1003 09:13:45.890175 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnsr_085812e0-4616-4b7e-a4b4-1301aa194042/extract-content/0.log" Oct 03 09:13:46 crc kubenswrapper[4765]: I1003 09:13:46.085745 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnsr_085812e0-4616-4b7e-a4b4-1301aa194042/extract-utilities/0.log" Oct 03 09:13:46 crc kubenswrapper[4765]: I1003 09:13:46.095486 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnsr_085812e0-4616-4b7e-a4b4-1301aa194042/extract-content/0.log" Oct 03 09:13:46 crc kubenswrapper[4765]: I1003 09:13:46.583106 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bgnsr_085812e0-4616-4b7e-a4b4-1301aa194042/registry-server/0.log" Oct 03 09:13:57 crc kubenswrapper[4765]: I1003 09:13:57.846199 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-b27n7_bc3e65d2-829a-4385-b865-15e288293af9/prometheus-operator/0.log" Oct 03 09:13:58 crc kubenswrapper[4765]: I1003 09:13:58.019887 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-665f459cd-2mtm8_4310e68b-d273-4784-8e5f-9114306616d8/prometheus-operator-admission-webhook/0.log" Oct 03 09:13:58 crc kubenswrapper[4765]: I1003 09:13:58.097512 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-665f459cd-hcgt2_3f879d8d-cb72-4b5f-a9f6-96ba70a723c0/prometheus-operator-admission-webhook/0.log" Oct 03 09:13:58 crc kubenswrapper[4765]: I1003 09:13:58.261049 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-ndvd9_9b3ed21b-9c8f-45e2-a048-c6d1e0324360/operator/0.log" Oct 03 09:13:58 crc kubenswrapper[4765]: I1003 09:13:58.313376 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-6584dc9448-d5k5d_1f6d6f47-0f09-4c55-8df8-5346c3f8ffb7/observability-ui-dashboards/0.log" Oct 03 09:13:58 crc kubenswrapper[4765]: I1003 09:13:58.437167 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-76rtm_4cdf3ecf-31a5-43b7-9df2-d0a6d2e8fb56/perses-operator/0.log" Oct 03 09:14:17 crc kubenswrapper[4765]: I1003 09:14:17.957145 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5fsx5"] Oct 03 09:14:17 crc kubenswrapper[4765]: E1003 09:14:17.958105 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04492332-af2a-41b4-a716-2c926247b9b0" containerName="extract-content" Oct 03 09:14:17 crc kubenswrapper[4765]: I1003 09:14:17.958120 4765 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="04492332-af2a-41b4-a716-2c926247b9b0" containerName="extract-content" Oct 03 09:14:17 crc kubenswrapper[4765]: E1003 09:14:17.958162 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04492332-af2a-41b4-a716-2c926247b9b0" containerName="extract-utilities" Oct 03 09:14:17 crc kubenswrapper[4765]: I1003 09:14:17.958169 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="04492332-af2a-41b4-a716-2c926247b9b0" containerName="extract-utilities" Oct 03 09:14:17 crc kubenswrapper[4765]: E1003 09:14:17.958183 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04492332-af2a-41b4-a716-2c926247b9b0" containerName="registry-server" Oct 03 09:14:17 crc kubenswrapper[4765]: I1003 09:14:17.958190 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="04492332-af2a-41b4-a716-2c926247b9b0" containerName="registry-server" Oct 03 09:14:17 crc kubenswrapper[4765]: I1003 09:14:17.958390 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="04492332-af2a-41b4-a716-2c926247b9b0" containerName="registry-server" Oct 03 09:14:17 crc kubenswrapper[4765]: I1003 09:14:17.960041 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:17 crc kubenswrapper[4765]: I1003 09:14:17.971022 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5fsx5"] Oct 03 09:14:18 crc kubenswrapper[4765]: I1003 09:14:18.050278 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-utilities\") pod \"certified-operators-5fsx5\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:18 crc kubenswrapper[4765]: I1003 09:14:18.050331 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-catalog-content\") pod \"certified-operators-5fsx5\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:18 crc kubenswrapper[4765]: I1003 09:14:18.050641 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzqnc\" (UniqueName: \"kubernetes.io/projected/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-kube-api-access-vzqnc\") pod \"certified-operators-5fsx5\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:18 crc kubenswrapper[4765]: I1003 09:14:18.152343 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzqnc\" (UniqueName: \"kubernetes.io/projected/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-kube-api-access-vzqnc\") pod \"certified-operators-5fsx5\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:18 crc kubenswrapper[4765]: I1003 09:14:18.152447 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-utilities\") pod \"certified-operators-5fsx5\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:18 crc 
kubenswrapper[4765]: I1003 09:14:18.152470 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-catalog-content\") pod \"certified-operators-5fsx5\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:18 crc kubenswrapper[4765]: I1003 09:14:18.153115 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-catalog-content\") pod \"certified-operators-5fsx5\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:18 crc kubenswrapper[4765]: I1003 09:14:18.153179 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-utilities\") pod \"certified-operators-5fsx5\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:18 crc kubenswrapper[4765]: I1003 09:14:18.173372 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzqnc\" (UniqueName: \"kubernetes.io/projected/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-kube-api-access-vzqnc\") pod \"certified-operators-5fsx5\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:18 crc kubenswrapper[4765]: I1003 09:14:18.283548 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:18 crc kubenswrapper[4765]: I1003 09:14:18.578607 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5fsx5"] Oct 03 09:14:18 crc kubenswrapper[4765]: I1003 09:14:18.755124 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fsx5" event={"ID":"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4","Type":"ContainerStarted","Data":"266efaa0b3458ac3c421b933a34f46a0a394f32b624e98c691b2fb3238e45145"} Oct 03 09:14:19 crc kubenswrapper[4765]: I1003 09:14:19.764398 4765 generic.go:334] "Generic (PLEG): container finished" podID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" containerID="2fabdb4d5c2fc07965731d69ff7ddcab20ace27a28d704c6fe5e39624178312b" exitCode=0 Oct 03 09:14:19 crc kubenswrapper[4765]: I1003 09:14:19.764467 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fsx5" event={"ID":"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4","Type":"ContainerDied","Data":"2fabdb4d5c2fc07965731d69ff7ddcab20ace27a28d704c6fe5e39624178312b"} Oct 03 09:14:20 crc kubenswrapper[4765]: I1003 09:14:20.408603 4765 scope.go:117] "RemoveContainer" containerID="d63ad9af49bea67159a5f84c95383734beb6b9e672682fefa4010326a84c3331" Oct 03 09:14:21 crc kubenswrapper[4765]: I1003 09:14:21.782996 4765 generic.go:334] "Generic (PLEG): container finished" podID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" containerID="1c9a28f91cd8e5c8d7e8b95bd8766c7c167afa0b8a2de96326d1f47b9e40ba18" exitCode=0 Oct 03 09:14:21 crc kubenswrapper[4765]: I1003 09:14:21.783049 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fsx5" 
event={"ID":"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4","Type":"ContainerDied","Data":"1c9a28f91cd8e5c8d7e8b95bd8766c7c167afa0b8a2de96326d1f47b9e40ba18"} Oct 03 09:14:23 crc kubenswrapper[4765]: I1003 09:14:23.801861 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fsx5" event={"ID":"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4","Type":"ContainerStarted","Data":"ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97"} Oct 03 09:14:23 crc kubenswrapper[4765]: I1003 09:14:23.824038 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5fsx5" podStartSLOduration=3.869127026 podStartE2EDuration="6.824016933s" podCreationTimestamp="2025-10-03 09:14:17 +0000 UTC" firstStartedPulling="2025-10-03 09:14:19.76632145 +0000 UTC m=+2104.067815780" lastFinishedPulling="2025-10-03 09:14:22.721211357 +0000 UTC m=+2107.022705687" observedRunningTime="2025-10-03 09:14:23.821059396 +0000 UTC m=+2108.122553726" watchObservedRunningTime="2025-10-03 09:14:23.824016933 +0000 UTC m=+2108.125511273" Oct 03 09:14:28 crc kubenswrapper[4765]: I1003 09:14:28.284214 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:28 crc kubenswrapper[4765]: I1003 09:14:28.284568 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:28 crc kubenswrapper[4765]: I1003 09:14:28.328304 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:28 crc kubenswrapper[4765]: I1003 09:14:28.896541 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:31 crc kubenswrapper[4765]: I1003 09:14:31.950254 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5fsx5"] Oct 03 09:14:31 crc kubenswrapper[4765]: I1003 09:14:31.950533 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5fsx5" podUID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" containerName="registry-server" containerID="cri-o://ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97" gracePeriod=2 Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.353829 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.475498 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzqnc\" (UniqueName: \"kubernetes.io/projected/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-kube-api-access-vzqnc\") pod \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.475850 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-catalog-content\") pod \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.475984 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-utilities\") pod \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\" (UID: \"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4\") " Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.476903 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-utilities" (OuterVolumeSpecName: "utilities") pod "6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" (UID: "6ff261c8-10d1-44c5-b224-dbefb6a5a4a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.480996 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-kube-api-access-vzqnc" (OuterVolumeSpecName: "kube-api-access-vzqnc") pod "6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" (UID: "6ff261c8-10d1-44c5-b224-dbefb6a5a4a4"). InnerVolumeSpecName "kube-api-access-vzqnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.523373 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" (UID: "6ff261c8-10d1-44c5-b224-dbefb6a5a4a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.577851 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzqnc\" (UniqueName: \"kubernetes.io/projected/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-kube-api-access-vzqnc\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.577906 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.577920 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.877095 4765 generic.go:334] "Generic (PLEG): container finished" podID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" containerID="ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97" exitCode=0 Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.877176 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fsx5" event={"ID":"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4","Type":"ContainerDied","Data":"ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97"} Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.877217 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5fsx5" event={"ID":"6ff261c8-10d1-44c5-b224-dbefb6a5a4a4","Type":"ContainerDied","Data":"266efaa0b3458ac3c421b933a34f46a0a394f32b624e98c691b2fb3238e45145"} Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.877241 4765 scope.go:117] "RemoveContainer" containerID="ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.877184 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5fsx5" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.895721 4765 scope.go:117] "RemoveContainer" containerID="1c9a28f91cd8e5c8d7e8b95bd8766c7c167afa0b8a2de96326d1f47b9e40ba18" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.912445 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5fsx5"] Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.919005 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5fsx5"] Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.929491 4765 scope.go:117] "RemoveContainer" containerID="2fabdb4d5c2fc07965731d69ff7ddcab20ace27a28d704c6fe5e39624178312b" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.949827 4765 scope.go:117] "RemoveContainer" containerID="ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97" Oct 03 09:14:32 crc kubenswrapper[4765]: E1003 09:14:32.950297 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97\": container with ID starting with ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97 not found: ID does not exist" containerID="ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.950329 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97"} err="failed to get container status \"ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97\": rpc error: code = NotFound desc = could not find container \"ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97\": container with ID starting with ccbe4feecff083c89b2e61fa77a5f822bdf07d85f942904cb7451aff9f906f97 not found: ID does not exist" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.950352 4765 scope.go:117] "RemoveContainer" containerID="1c9a28f91cd8e5c8d7e8b95bd8766c7c167afa0b8a2de96326d1f47b9e40ba18" Oct 03 09:14:32 crc kubenswrapper[4765]: E1003 09:14:32.951942 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c9a28f91cd8e5c8d7e8b95bd8766c7c167afa0b8a2de96326d1f47b9e40ba18\": container with ID starting with 1c9a28f91cd8e5c8d7e8b95bd8766c7c167afa0b8a2de96326d1f47b9e40ba18 not found: ID does not exist" containerID="1c9a28f91cd8e5c8d7e8b95bd8766c7c167afa0b8a2de96326d1f47b9e40ba18" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.952111 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c9a28f91cd8e5c8d7e8b95bd8766c7c167afa0b8a2de96326d1f47b9e40ba18"} err="failed to get container status \"1c9a28f91cd8e5c8d7e8b95bd8766c7c167afa0b8a2de96326d1f47b9e40ba18\": rpc error: code = NotFound desc = could not find container \"1c9a28f91cd8e5c8d7e8b95bd8766c7c167afa0b8a2de96326d1f47b9e40ba18\": container with ID starting with 1c9a28f91cd8e5c8d7e8b95bd8766c7c167afa0b8a2de96326d1f47b9e40ba18 not found: ID does not exist" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.952218 4765 scope.go:117] "RemoveContainer" containerID="2fabdb4d5c2fc07965731d69ff7ddcab20ace27a28d704c6fe5e39624178312b" Oct 03 09:14:32 crc kubenswrapper[4765]: E1003 09:14:32.952700 4765 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2fabdb4d5c2fc07965731d69ff7ddcab20ace27a28d704c6fe5e39624178312b\": container with ID starting with 2fabdb4d5c2fc07965731d69ff7ddcab20ace27a28d704c6fe5e39624178312b not found: ID does not exist" containerID="2fabdb4d5c2fc07965731d69ff7ddcab20ace27a28d704c6fe5e39624178312b" Oct 03 09:14:32 crc kubenswrapper[4765]: I1003 09:14:32.952795 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fabdb4d5c2fc07965731d69ff7ddcab20ace27a28d704c6fe5e39624178312b"} err="failed to get container status \"2fabdb4d5c2fc07965731d69ff7ddcab20ace27a28d704c6fe5e39624178312b\": rpc error: code = NotFound desc = could not find container \"2fabdb4d5c2fc07965731d69ff7ddcab20ace27a28d704c6fe5e39624178312b\": container with ID starting with 2fabdb4d5c2fc07965731d69ff7ddcab20ace27a28d704c6fe5e39624178312b not found: ID does not exist" Oct 03 09:14:34 crc kubenswrapper[4765]: I1003 09:14:34.318142 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" path="/var/lib/kubelet/pods/6ff261c8-10d1-44c5-b224-dbefb6a5a4a4/volumes" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.160432 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr"] Oct 03 09:15:00 crc kubenswrapper[4765]: E1003 09:15:00.161359 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" containerName="registry-server" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.161375 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" containerName="registry-server" Oct 03 09:15:00 crc kubenswrapper[4765]: E1003 09:15:00.161389 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" containerName="extract-utilities" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.161395 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" containerName="extract-utilities" Oct 03 09:15:00 crc kubenswrapper[4765]: E1003 09:15:00.161410 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" containerName="extract-content" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.161417 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" containerName="extract-content" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.161582 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff261c8-10d1-44c5-b224-dbefb6a5a4a4" containerName="registry-server" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.162181 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.164859 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.165698 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.179127 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr"] Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.241160 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a61f4bf-09d3-4fab-809a-4105c512de46-config-volume\") pod \"collect-profiles-29324715-zmxsr\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.241284 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a61f4bf-09d3-4fab-809a-4105c512de46-secret-volume\") pod \"collect-profiles-29324715-zmxsr\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.241443 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzgdp\" (UniqueName: \"kubernetes.io/projected/9a61f4bf-09d3-4fab-809a-4105c512de46-kube-api-access-lzgdp\") pod \"collect-profiles-29324715-zmxsr\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.343411 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzgdp\" (UniqueName: \"kubernetes.io/projected/9a61f4bf-09d3-4fab-809a-4105c512de46-kube-api-access-lzgdp\") pod \"collect-profiles-29324715-zmxsr\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.344391 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a61f4bf-09d3-4fab-809a-4105c512de46-config-volume\") pod \"collect-profiles-29324715-zmxsr\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.344472 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a61f4bf-09d3-4fab-809a-4105c512de46-config-volume\") pod \"collect-profiles-29324715-zmxsr\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.344587 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a61f4bf-09d3-4fab-809a-4105c512de46-secret-volume\") pod 
\"collect-profiles-29324715-zmxsr\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.350668 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a61f4bf-09d3-4fab-809a-4105c512de46-secret-volume\") pod \"collect-profiles-29324715-zmxsr\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.367793 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzgdp\" (UniqueName: \"kubernetes.io/projected/9a61f4bf-09d3-4fab-809a-4105c512de46-kube-api-access-lzgdp\") pod \"collect-profiles-29324715-zmxsr\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.491920 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:00 crc kubenswrapper[4765]: I1003 09:15:00.947548 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr"] Oct 03 09:15:01 crc kubenswrapper[4765]: I1003 09:15:01.111002 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" event={"ID":"9a61f4bf-09d3-4fab-809a-4105c512de46","Type":"ContainerStarted","Data":"8ead2184dabbe8c619ab3cbbed6c51776b4ac865b517854a16ff590bf7d932d2"} Oct 03 09:15:02 crc kubenswrapper[4765]: I1003 09:15:02.121596 4765 generic.go:334] "Generic (PLEG): container finished" podID="0bddda19-bd72-467b-956e-72a43398545f" containerID="ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de" exitCode=0 Oct 03 09:15:02 crc kubenswrapper[4765]: I1003 09:15:02.121687 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gkwlq/must-gather-k88dn" event={"ID":"0bddda19-bd72-467b-956e-72a43398545f","Type":"ContainerDied","Data":"ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de"} Oct 03 09:15:02 crc kubenswrapper[4765]: I1003 09:15:02.122534 4765 scope.go:117] "RemoveContainer" containerID="ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de" Oct 03 09:15:02 crc kubenswrapper[4765]: I1003 09:15:02.124711 4765 generic.go:334] "Generic (PLEG): container finished" podID="9a61f4bf-09d3-4fab-809a-4105c512de46" containerID="e291dd0762ca08c6b0cd6d72c888ef148a29441ebbce494a1a5396a6368214e5" exitCode=0 Oct 03 09:15:02 crc kubenswrapper[4765]: I1003 09:15:02.124771 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" event={"ID":"9a61f4bf-09d3-4fab-809a-4105c512de46","Type":"ContainerDied","Data":"e291dd0762ca08c6b0cd6d72c888ef148a29441ebbce494a1a5396a6368214e5"} Oct 03 09:15:02 crc kubenswrapper[4765]: I1003 09:15:02.170851 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gkwlq_must-gather-k88dn_0bddda19-bd72-467b-956e-72a43398545f/gather/0.log" Oct 03 09:15:03 crc kubenswrapper[4765]: I1003 09:15:03.423021 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:03 crc kubenswrapper[4765]: I1003 09:15:03.497472 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzgdp\" (UniqueName: \"kubernetes.io/projected/9a61f4bf-09d3-4fab-809a-4105c512de46-kube-api-access-lzgdp\") pod \"9a61f4bf-09d3-4fab-809a-4105c512de46\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " Oct 03 09:15:03 crc kubenswrapper[4765]: I1003 09:15:03.497540 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a61f4bf-09d3-4fab-809a-4105c512de46-secret-volume\") pod \"9a61f4bf-09d3-4fab-809a-4105c512de46\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " Oct 03 09:15:03 crc kubenswrapper[4765]: I1003 09:15:03.497682 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a61f4bf-09d3-4fab-809a-4105c512de46-config-volume\") pod \"9a61f4bf-09d3-4fab-809a-4105c512de46\" (UID: \"9a61f4bf-09d3-4fab-809a-4105c512de46\") " Oct 03 09:15:03 crc kubenswrapper[4765]: I1003 09:15:03.498357 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a61f4bf-09d3-4fab-809a-4105c512de46-config-volume" (OuterVolumeSpecName: "config-volume") pod "9a61f4bf-09d3-4fab-809a-4105c512de46" (UID: "9a61f4bf-09d3-4fab-809a-4105c512de46"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:15:03 crc kubenswrapper[4765]: I1003 09:15:03.503265 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a61f4bf-09d3-4fab-809a-4105c512de46-kube-api-access-lzgdp" (OuterVolumeSpecName: "kube-api-access-lzgdp") pod "9a61f4bf-09d3-4fab-809a-4105c512de46" (UID: "9a61f4bf-09d3-4fab-809a-4105c512de46"). InnerVolumeSpecName "kube-api-access-lzgdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:15:03 crc kubenswrapper[4765]: I1003 09:15:03.503738 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a61f4bf-09d3-4fab-809a-4105c512de46-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9a61f4bf-09d3-4fab-809a-4105c512de46" (UID: "9a61f4bf-09d3-4fab-809a-4105c512de46"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:15:03 crc kubenswrapper[4765]: I1003 09:15:03.599897 4765 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a61f4bf-09d3-4fab-809a-4105c512de46-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:03 crc kubenswrapper[4765]: I1003 09:15:03.599932 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzgdp\" (UniqueName: \"kubernetes.io/projected/9a61f4bf-09d3-4fab-809a-4105c512de46-kube-api-access-lzgdp\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:03 crc kubenswrapper[4765]: I1003 09:15:03.599946 4765 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a61f4bf-09d3-4fab-809a-4105c512de46-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:04 crc kubenswrapper[4765]: I1003 09:15:04.143485 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" event={"ID":"9a61f4bf-09d3-4fab-809a-4105c512de46","Type":"ContainerDied","Data":"8ead2184dabbe8c619ab3cbbed6c51776b4ac865b517854a16ff590bf7d932d2"} Oct 03 09:15:04 crc kubenswrapper[4765]: I1003 09:15:04.143530 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ead2184dabbe8c619ab3cbbed6c51776b4ac865b517854a16ff590bf7d932d2" Oct 03 09:15:04 crc kubenswrapper[4765]: I1003 09:15:04.143557 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-zmxsr" Oct 03 09:15:04 crc kubenswrapper[4765]: I1003 09:15:04.521481 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8"] Oct 03 09:15:04 crc kubenswrapper[4765]: I1003 09:15:04.527150 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-fpjm8"] Oct 03 09:15:06 crc kubenswrapper[4765]: I1003 09:15:06.316483 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="432cff95-d219-46af-bfc4-c5afbe99c9c0" path="/var/lib/kubelet/pods/432cff95-d219-46af-bfc4-c5afbe99c9c0/volumes" Oct 03 09:15:10 crc kubenswrapper[4765]: I1003 09:15:10.420318 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-gkwlq/must-gather-k88dn"] Oct 03 09:15:10 crc kubenswrapper[4765]: I1003 09:15:10.421202 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-gkwlq/must-gather-k88dn" podUID="0bddda19-bd72-467b-956e-72a43398545f" containerName="copy" containerID="cri-o://841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91" gracePeriod=2 Oct 03 09:15:10 crc kubenswrapper[4765]: I1003 09:15:10.425184 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-gkwlq/must-gather-k88dn"] Oct 03 09:15:10 crc kubenswrapper[4765]: I1003 09:15:10.851883 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gkwlq_must-gather-k88dn_0bddda19-bd72-467b-956e-72a43398545f/copy/0.log" Oct 03 09:15:10 crc kubenswrapper[4765]: I1003 09:15:10.852790 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gkwlq/must-gather-k88dn" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.008940 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsc69\" (UniqueName: \"kubernetes.io/projected/0bddda19-bd72-467b-956e-72a43398545f-kube-api-access-qsc69\") pod \"0bddda19-bd72-467b-956e-72a43398545f\" (UID: \"0bddda19-bd72-467b-956e-72a43398545f\") " Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.009149 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0bddda19-bd72-467b-956e-72a43398545f-must-gather-output\") pod \"0bddda19-bd72-467b-956e-72a43398545f\" (UID: \"0bddda19-bd72-467b-956e-72a43398545f\") " Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.015028 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bddda19-bd72-467b-956e-72a43398545f-kube-api-access-qsc69" (OuterVolumeSpecName: "kube-api-access-qsc69") pod "0bddda19-bd72-467b-956e-72a43398545f" (UID: "0bddda19-bd72-467b-956e-72a43398545f"). InnerVolumeSpecName "kube-api-access-qsc69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.110398 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bddda19-bd72-467b-956e-72a43398545f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0bddda19-bd72-467b-956e-72a43398545f" (UID: "0bddda19-bd72-467b-956e-72a43398545f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.110769 4765 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0bddda19-bd72-467b-956e-72a43398545f-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.110800 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsc69\" (UniqueName: \"kubernetes.io/projected/0bddda19-bd72-467b-956e-72a43398545f-kube-api-access-qsc69\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.201736 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-gkwlq_must-gather-k88dn_0bddda19-bd72-467b-956e-72a43398545f/copy/0.log" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.202260 4765 generic.go:334] "Generic (PLEG): container finished" podID="0bddda19-bd72-467b-956e-72a43398545f" containerID="841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91" exitCode=143 Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.202305 4765 scope.go:117] "RemoveContainer" containerID="841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.202395 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gkwlq/must-gather-k88dn" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.219707 4765 scope.go:117] "RemoveContainer" containerID="ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.284730 4765 scope.go:117] "RemoveContainer" containerID="841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91" Oct 03 09:15:11 crc kubenswrapper[4765]: E1003 09:15:11.285133 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91\": container with ID starting with 841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91 not found: ID does not exist" containerID="841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.285163 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91"} err="failed to get container status \"841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91\": rpc error: code = NotFound desc = could not find container \"841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91\": container with ID starting with 841914e9b3edf8de5cb68b7586d96ee3d7fb064ddca368c749a4fe785cb8fb91 not found: ID does not exist" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.285186 4765 scope.go:117] "RemoveContainer" containerID="ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de" Oct 03 09:15:11 crc kubenswrapper[4765]: E1003 09:15:11.285543 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de\": container with ID starting with ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de not found: ID does not exist" containerID="ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de" Oct 03 09:15:11 crc kubenswrapper[4765]: I1003 09:15:11.285628 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de"} err="failed to get container status \"ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de\": rpc error: code = NotFound desc = could not find container \"ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de\": container with ID starting with ac592d307cbd4050950361e916d45fc9bf50e78ffd0b739d1e025232d0e1c9de not found: ID does not exist" Oct 03 09:15:12 crc kubenswrapper[4765]: I1003 09:15:12.316299 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bddda19-bd72-467b-956e-72a43398545f" path="/var/lib/kubelet/pods/0bddda19-bd72-467b-956e-72a43398545f/volumes" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.394616 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4zg46"] Oct 03 09:15:19 crc kubenswrapper[4765]: E1003 09:15:19.395871 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bddda19-bd72-467b-956e-72a43398545f" containerName="copy" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.395887 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bddda19-bd72-467b-956e-72a43398545f" containerName="copy" Oct 03 09:15:19 crc 
kubenswrapper[4765]: E1003 09:15:19.395898 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bddda19-bd72-467b-956e-72a43398545f" containerName="gather" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.395904 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bddda19-bd72-467b-956e-72a43398545f" containerName="gather" Oct 03 09:15:19 crc kubenswrapper[4765]: E1003 09:15:19.395927 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a61f4bf-09d3-4fab-809a-4105c512de46" containerName="collect-profiles" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.395933 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a61f4bf-09d3-4fab-809a-4105c512de46" containerName="collect-profiles" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.396079 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bddda19-bd72-467b-956e-72a43398545f" containerName="copy" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.396095 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bddda19-bd72-467b-956e-72a43398545f" containerName="gather" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.396111 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a61f4bf-09d3-4fab-809a-4105c512de46" containerName="collect-profiles" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.398954 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.410363 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zg46"] Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.581234 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sc78\" (UniqueName: \"kubernetes.io/projected/3d75be31-37ee-48f5-97af-9aa60eac8b96-kube-api-access-5sc78\") pod \"redhat-marketplace-4zg46\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.581621 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-catalog-content\") pod \"redhat-marketplace-4zg46\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.581770 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-utilities\") pod \"redhat-marketplace-4zg46\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.682879 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-utilities\") pod \"redhat-marketplace-4zg46\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.683235 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sc78\" (UniqueName: 
\"kubernetes.io/projected/3d75be31-37ee-48f5-97af-9aa60eac8b96-kube-api-access-5sc78\") pod \"redhat-marketplace-4zg46\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.683289 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-catalog-content\") pod \"redhat-marketplace-4zg46\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.683744 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-utilities\") pod \"redhat-marketplace-4zg46\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.683768 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-catalog-content\") pod \"redhat-marketplace-4zg46\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.706288 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sc78\" (UniqueName: \"kubernetes.io/projected/3d75be31-37ee-48f5-97af-9aa60eac8b96-kube-api-access-5sc78\") pod \"redhat-marketplace-4zg46\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:19 crc kubenswrapper[4765]: I1003 09:15:19.724311 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:20 crc kubenswrapper[4765]: I1003 09:15:20.485168 4765 scope.go:117] "RemoveContainer" containerID="8f7a0a346594db76148950b9853e166a62634e5289700e48c0ba599d70913a66" Oct 03 09:15:20 crc kubenswrapper[4765]: I1003 09:15:20.511876 4765 scope.go:117] "RemoveContainer" containerID="82095e3c8a766ab9578d34df5a12aed1393ea5c90dcc36c1b3d5b6998b2cf1c6" Oct 03 09:15:20 crc kubenswrapper[4765]: I1003 09:15:20.543080 4765 scope.go:117] "RemoveContainer" containerID="69df2fe02aca2ccab6a7c3ba623df1fe4c5bd0889c95dcaab6a47e692ab00860" Oct 03 09:15:20 crc kubenswrapper[4765]: I1003 09:15:20.577693 4765 scope.go:117] "RemoveContainer" containerID="a91f44cc0c667004eee6ca4eea74e36064adc2cbd51680a6064eaf99964c8d76" Oct 03 09:15:20 crc kubenswrapper[4765]: I1003 09:15:20.593334 4765 scope.go:117] "RemoveContainer" containerID="58f9d0a6d37ff142730f3ba68ad738dc9ac8d9317f4f3c26c3a577dc34fdd2f9" Oct 03 09:15:20 crc kubenswrapper[4765]: I1003 09:15:20.608333 4765 scope.go:117] "RemoveContainer" containerID="8b36e642d06489402d75d70d3196a61e9a07ee67a6b1a2cbdce5f38e7afa9daa" Oct 03 09:15:20 crc kubenswrapper[4765]: I1003 09:15:20.769103 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zg46"] Oct 03 09:15:21 crc kubenswrapper[4765]: I1003 09:15:21.288208 4765 generic.go:334] "Generic (PLEG): container finished" podID="3d75be31-37ee-48f5-97af-9aa60eac8b96" containerID="8dec561c8acc004719fe853d95a8a77e637c2e4828acd4bba98656c4a3660d75" exitCode=0 Oct 03 09:15:21 crc kubenswrapper[4765]: I1003 09:15:21.288322 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zg46" event={"ID":"3d75be31-37ee-48f5-97af-9aa60eac8b96","Type":"ContainerDied","Data":"8dec561c8acc004719fe853d95a8a77e637c2e4828acd4bba98656c4a3660d75"} Oct 03 09:15:21 crc kubenswrapper[4765]: I1003 09:15:21.288524 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zg46" event={"ID":"3d75be31-37ee-48f5-97af-9aa60eac8b96","Type":"ContainerStarted","Data":"efd58979d299014041f4a4beb285d06a1866c39e0c671ce2da5e1111fb5443c0"} Oct 03 09:15:22 crc kubenswrapper[4765]: I1003 09:15:22.297726 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zg46" event={"ID":"3d75be31-37ee-48f5-97af-9aa60eac8b96","Type":"ContainerStarted","Data":"d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c"} Oct 03 09:15:23 crc kubenswrapper[4765]: I1003 09:15:23.306954 4765 generic.go:334] "Generic (PLEG): container finished" podID="3d75be31-37ee-48f5-97af-9aa60eac8b96" containerID="d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c" exitCode=0 Oct 03 09:15:23 crc kubenswrapper[4765]: I1003 09:15:23.307048 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zg46" event={"ID":"3d75be31-37ee-48f5-97af-9aa60eac8b96","Type":"ContainerDied","Data":"d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c"} Oct 03 09:15:24 crc kubenswrapper[4765]: I1003 09:15:24.321999 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zg46" event={"ID":"3d75be31-37ee-48f5-97af-9aa60eac8b96","Type":"ContainerStarted","Data":"2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa"} Oct 03 09:15:24 crc kubenswrapper[4765]: I1003 09:15:24.349393 4765 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4zg46" podStartSLOduration=2.811361138 podStartE2EDuration="5.349376361s" podCreationTimestamp="2025-10-03 09:15:19 +0000 UTC" firstStartedPulling="2025-10-03 09:15:21.290288364 +0000 UTC m=+2165.591782694" lastFinishedPulling="2025-10-03 09:15:23.828303577 +0000 UTC m=+2168.129797917" observedRunningTime="2025-10-03 09:15:24.346395174 +0000 UTC m=+2168.647889504" watchObservedRunningTime="2025-10-03 09:15:24.349376361 +0000 UTC m=+2168.650870691" Oct 03 09:15:29 crc kubenswrapper[4765]: I1003 09:15:29.725000 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:29 crc kubenswrapper[4765]: I1003 09:15:29.726344 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:29 crc kubenswrapper[4765]: I1003 09:15:29.766984 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:30 crc kubenswrapper[4765]: I1003 09:15:30.405281 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:33 crc kubenswrapper[4765]: I1003 09:15:33.377074 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zg46"] Oct 03 09:15:33 crc kubenswrapper[4765]: I1003 09:15:33.377585 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4zg46" podUID="3d75be31-37ee-48f5-97af-9aa60eac8b96" containerName="registry-server" containerID="cri-o://2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa" gracePeriod=2 Oct 03 09:15:33 crc kubenswrapper[4765]: I1003 09:15:33.789421 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:33 crc kubenswrapper[4765]: I1003 09:15:33.902598 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-utilities\") pod \"3d75be31-37ee-48f5-97af-9aa60eac8b96\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " Oct 03 09:15:33 crc kubenswrapper[4765]: I1003 09:15:33.902820 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sc78\" (UniqueName: \"kubernetes.io/projected/3d75be31-37ee-48f5-97af-9aa60eac8b96-kube-api-access-5sc78\") pod \"3d75be31-37ee-48f5-97af-9aa60eac8b96\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " Oct 03 09:15:33 crc kubenswrapper[4765]: I1003 09:15:33.902904 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-catalog-content\") pod \"3d75be31-37ee-48f5-97af-9aa60eac8b96\" (UID: \"3d75be31-37ee-48f5-97af-9aa60eac8b96\") " Oct 03 09:15:33 crc kubenswrapper[4765]: I1003 09:15:33.904074 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-utilities" (OuterVolumeSpecName: "utilities") pod "3d75be31-37ee-48f5-97af-9aa60eac8b96" (UID: "3d75be31-37ee-48f5-97af-9aa60eac8b96"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:15:33 crc kubenswrapper[4765]: I1003 09:15:33.909219 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d75be31-37ee-48f5-97af-9aa60eac8b96-kube-api-access-5sc78" (OuterVolumeSpecName: "kube-api-access-5sc78") pod "3d75be31-37ee-48f5-97af-9aa60eac8b96" (UID: "3d75be31-37ee-48f5-97af-9aa60eac8b96"). InnerVolumeSpecName "kube-api-access-5sc78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:15:33 crc kubenswrapper[4765]: I1003 09:15:33.916348 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d75be31-37ee-48f5-97af-9aa60eac8b96" (UID: "3d75be31-37ee-48f5-97af-9aa60eac8b96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.004523 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sc78\" (UniqueName: \"kubernetes.io/projected/3d75be31-37ee-48f5-97af-9aa60eac8b96-kube-api-access-5sc78\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.004562 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.004572 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d75be31-37ee-48f5-97af-9aa60eac8b96-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.397811 4765 generic.go:334] "Generic (PLEG): container finished" podID="3d75be31-37ee-48f5-97af-9aa60eac8b96" containerID="2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa" exitCode=0 Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.397851 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4zg46" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.397851 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zg46" event={"ID":"3d75be31-37ee-48f5-97af-9aa60eac8b96","Type":"ContainerDied","Data":"2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa"} Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.397976 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4zg46" event={"ID":"3d75be31-37ee-48f5-97af-9aa60eac8b96","Type":"ContainerDied","Data":"efd58979d299014041f4a4beb285d06a1866c39e0c671ce2da5e1111fb5443c0"} Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.398001 4765 scope.go:117] "RemoveContainer" containerID="2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.417967 4765 scope.go:117] "RemoveContainer" containerID="d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.421391 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zg46"] Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.429933 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4zg46"] Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.435498 4765 scope.go:117] "RemoveContainer" containerID="8dec561c8acc004719fe853d95a8a77e637c2e4828acd4bba98656c4a3660d75" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.474635 4765 scope.go:117] "RemoveContainer" containerID="2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa" Oct 03 09:15:34 crc kubenswrapper[4765]: E1003 09:15:34.475323 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa\": container with ID starting with 2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa not found: ID does not exist" containerID="2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.475361 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa"} err="failed to get container status \"2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa\": rpc error: code = NotFound desc = could not find container \"2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa\": container with ID starting with 2dd6961e058a045ac7a3d3a8154719cf4a202e3b244fa96e1de8ab51455956aa not found: ID does not exist" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.475384 4765 scope.go:117] "RemoveContainer" containerID="d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c" Oct 03 09:15:34 crc kubenswrapper[4765]: E1003 09:15:34.475798 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c\": container with ID starting with d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c not found: ID does not exist" containerID="d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.475828 4765 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c"} err="failed to get container status \"d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c\": rpc error: code = NotFound desc = could not find container \"d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c\": container with ID starting with d41d1f2f5b38736e6083eb15c19e5e843a29deb1fc93ea3ac5219cb9472dc92c not found: ID does not exist" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.475844 4765 scope.go:117] "RemoveContainer" containerID="8dec561c8acc004719fe853d95a8a77e637c2e4828acd4bba98656c4a3660d75" Oct 03 09:15:34 crc kubenswrapper[4765]: E1003 09:15:34.476234 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dec561c8acc004719fe853d95a8a77e637c2e4828acd4bba98656c4a3660d75\": container with ID starting with 8dec561c8acc004719fe853d95a8a77e637c2e4828acd4bba98656c4a3660d75 not found: ID does not exist" containerID="8dec561c8acc004719fe853d95a8a77e637c2e4828acd4bba98656c4a3660d75" Oct 03 09:15:34 crc kubenswrapper[4765]: I1003 09:15:34.476257 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dec561c8acc004719fe853d95a8a77e637c2e4828acd4bba98656c4a3660d75"} err="failed to get container status \"8dec561c8acc004719fe853d95a8a77e637c2e4828acd4bba98656c4a3660d75\": rpc error: code = NotFound desc = could not find container \"8dec561c8acc004719fe853d95a8a77e637c2e4828acd4bba98656c4a3660d75\": container with ID starting with 8dec561c8acc004719fe853d95a8a77e637c2e4828acd4bba98656c4a3660d75 not found: ID does not exist" Oct 03 09:15:36 crc kubenswrapper[4765]: I1003 09:15:36.316488 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d75be31-37ee-48f5-97af-9aa60eac8b96" path="/var/lib/kubelet/pods/3d75be31-37ee-48f5-97af-9aa60eac8b96/volumes" Oct 03 09:16:00 crc kubenswrapper[4765]: I1003 09:16:00.680610 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:16:00 crc kubenswrapper[4765]: I1003 09:16:00.681125 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:16:20 crc kubenswrapper[4765]: I1003 09:16:20.750066 4765 scope.go:117] "RemoveContainer" containerID="52e5333e0c644f0e7810a257ae9627cd84df8e24ed12d8948d8e8ac2fd1bca37" Oct 03 09:16:20 crc kubenswrapper[4765]: I1003 09:16:20.800965 4765 scope.go:117] "RemoveContainer" containerID="a1e1ad9af4dd403de16b7205ea870344fe8ae66f6f46d7c8cd0144ff04b9c5b9" Oct 03 09:16:20 crc kubenswrapper[4765]: I1003 09:16:20.844195 4765 scope.go:117] "RemoveContainer" containerID="c5aa3d69e81e36f32d78eaaeb917f4e4c5484bccba300b7d17b4d734bfc45b24" Oct 03 09:16:30 crc kubenswrapper[4765]: I1003 09:16:30.680280 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:16:30 crc kubenswrapper[4765]: I1003 09:16:30.680888 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:17:00 crc kubenswrapper[4765]: I1003 09:17:00.680170 4765 patch_prober.go:28] interesting pod/machine-config-daemon-j8mss container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:17:00 crc kubenswrapper[4765]: I1003 09:17:00.681022 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:17:00 crc kubenswrapper[4765]: I1003 09:17:00.681067 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" Oct 03 09:17:00 crc kubenswrapper[4765]: I1003 09:17:00.681858 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42bfa14a2e89b5cffa309745efef82767cc344ded4fb8e7eea4024e396e441ec"} pod="openshift-machine-config-operator/machine-config-daemon-j8mss" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:17:00 crc kubenswrapper[4765]: I1003 09:17:00.681905 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerName="machine-config-daemon" containerID="cri-o://42bfa14a2e89b5cffa309745efef82767cc344ded4fb8e7eea4024e396e441ec" gracePeriod=600 Oct 03 09:17:00 crc kubenswrapper[4765]: E1003 09:17:00.814454 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:17:01 crc kubenswrapper[4765]: I1003 09:17:01.083596 4765 generic.go:334] "Generic (PLEG): container finished" podID="d636dbad-9ffa-4ba7-953f-adea04b76a23" containerID="42bfa14a2e89b5cffa309745efef82767cc344ded4fb8e7eea4024e396e441ec" exitCode=0 Oct 03 09:17:01 crc kubenswrapper[4765]: I1003 09:17:01.083703 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" event={"ID":"d636dbad-9ffa-4ba7-953f-adea04b76a23","Type":"ContainerDied","Data":"42bfa14a2e89b5cffa309745efef82767cc344ded4fb8e7eea4024e396e441ec"} Oct 03 09:17:01 crc kubenswrapper[4765]: I1003 09:17:01.083788 4765 scope.go:117] "RemoveContainer" containerID="1978eaa8d4c867a88cdf1bd67d12cbe56c8e728282f53b9cdc636787901cab02" Oct 03 
09:17:01 crc kubenswrapper[4765]: I1003 09:17:01.084393 4765 scope.go:117] "RemoveContainer" containerID="42bfa14a2e89b5cffa309745efef82767cc344ded4fb8e7eea4024e396e441ec" Oct 03 09:17:01 crc kubenswrapper[4765]: E1003 09:17:01.084672 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:17:16 crc kubenswrapper[4765]: I1003 09:17:16.311838 4765 scope.go:117] "RemoveContainer" containerID="42bfa14a2e89b5cffa309745efef82767cc344ded4fb8e7eea4024e396e441ec" Oct 03 09:17:16 crc kubenswrapper[4765]: E1003 09:17:16.312499 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23" Oct 03 09:17:29 crc kubenswrapper[4765]: I1003 09:17:29.306970 4765 scope.go:117] "RemoveContainer" containerID="42bfa14a2e89b5cffa309745efef82767cc344ded4fb8e7eea4024e396e441ec" Oct 03 09:17:29 crc kubenswrapper[4765]: E1003 09:17:29.307595 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-j8mss_openshift-machine-config-operator(d636dbad-9ffa-4ba7-953f-adea04b76a23)\"" pod="openshift-machine-config-operator/machine-config-daemon-j8mss" podUID="d636dbad-9ffa-4ba7-953f-adea04b76a23"
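
The two "Observed pod startup duration" entries above (09:14:23 for certified-operators-5fsx5 and 09:15:24 for redhat-marketplace-4zg46) carry enough fields to cross-check themselves: in both cases podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp to the precision logged (e.g. 09:14:23.824016933 − 09:14:17 = 6.824016933 s), and podStartSLOduration equals that end-to-end figure minus the image-pull window between firstStartedPulling and lastFinishedPulling (6.824016933 − (2107.022705687 − 2104.067815780) = 3.869127026 s). The short Python sketch below redoes that arithmetic from the logged key/value fields; the ENTRY string and helper name are illustrative, and the "SLO duration excludes image-pull time" reading is an inference from these values rather than something stated in the journal itself.

```python
# Minimal sketch (illustrative names, not kubelet code): verify that
# podStartSLOduration == podStartE2EDuration - (lastFinishedPulling - firstStartedPulling)
# using the monotonic m=+<seconds> offsets embedded in the logged timestamps.
import re

# Key/value fields copied from the 09:14:23 "Observed pod startup duration" entry above.
ENTRY = (
    'podStartSLOduration=3.869127026 podStartE2EDuration="6.824016933s" '
    'firstStartedPulling="2025-10-03 09:14:19.76632145 +0000 UTC m=+2104.067815780" '
    'lastFinishedPulling="2025-10-03 09:14:22.721211357 +0000 UTC m=+2107.022705687"'
)

def monotonic(field: str, text: str) -> float:
    """Return the m=+<seconds> monotonic offset embedded in a quoted timestamp field."""
    return float(re.search(rf'{field}="[^"]*m=\+([0-9.]+)"', text).group(1))

slo = float(re.search(r"podStartSLOduration=([0-9.]+)", ENTRY).group(1))
e2e = float(re.search(r'podStartE2EDuration="([0-9.]+)s"', ENTRY).group(1))
pull = monotonic("lastFinishedPulling", ENTRY) - monotonic("firstStartedPulling", ENTRY)

print(f"image-pull window : {pull:.9f} s")        # 2.954889907
print(f"E2E - pull window : {e2e - pull:.9f} s")  # 3.869127026
print(f"logged SLO value  : {slo:.9f} s")         # 3.869127026
```

The same check applied to the 09:15:24 entry gives 5.349376361 − (2168.129797917 − 2165.591782694) = 2.811361138 s, matching its logged podStartSLOduration.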